AVFoundation

Setting and accessing iOS seek bar on lock screen

♀尐吖头ヾ submitted on 2019-12-09 13:09:03
Question: I am playing an audio file in an iOS app using AVQueuePlayer/AVFoundation. I have set the MPNowPlayingInfoCenter's now-playing information (album title, artist, artwork) like this: NSMutableDictionary *albumInfo = [[NSMutableDictionary alloc] init]; MPMediaItemArtwork *artworkP; UIImage *artWork = [UIImage imageNamed:album.imageUrl]; [albumInfo setObject:album.title forKey:MPMediaItemPropertyTitle]; [albumInfo setObject:album.auther forKey:MPMediaItemPropertyArtist]; [albumInfo setObject:album…
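Since the excerpt cuts off before the seek-bar part, here is a minimal Swift sketch of the usual approach (function names, titles, and the image name are placeholders, not from the post): besides title/artist/artwork, the lock-screen seek bar is driven by three extra keys for duration, elapsed time, and playback rate, and scrubbing is handled through the remote command center.

```swift
import UIKit
import AVFoundation
import MediaPlayer

// A hedged sketch, assuming an AVQueuePlayer like the one in the post.
func updateNowPlayingInfo(for player: AVQueuePlayer) {
    guard let item = player.currentItem else { return }
    var info: [String: Any] = [
        MPMediaItemPropertyTitle: "Track title",      // placeholder
        MPMediaItemPropertyArtist: "Artist name",     // placeholder
        // These three keys are what make the lock-screen seek bar appear
        // and track the current position:
        MPMediaItemPropertyPlaybackDuration: CMTimeGetSeconds(item.duration),
        MPNowPlayingInfoPropertyElapsedPlaybackTime: CMTimeGetSeconds(player.currentTime()),
        MPNowPlayingInfoPropertyPlaybackRate: Double(player.rate)
    ]
    if let image = UIImage(named: "cover") {          // placeholder asset
        info[MPMediaItemPropertyArtwork] =
            MPMediaItemArtwork(boundsSize: image.size) { _ in image }
    }
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}

// Making the seek bar scrubbable, not just visible:
func enableScrubbing(for player: AVQueuePlayer) {
    _ = MPRemoteCommandCenter.shared().changePlaybackPositionCommand.addTarget { event in
        guard let e = event as? MPChangePlaybackPositionCommandEvent else {
            return .commandFailed
        }
        player.seek(to: CMTime(seconds: e.positionTime, preferredTimescale: 600))
        return .success
    }
}
```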

Composing video and audio using AVMutableComposition

我与影子孤独终老i submitted on 2019-12-09 11:39:35
Question: I have a weird problem. In my app I am combining multiple audio and video files using the code below. The resulting video seems to work fine once I download it from the device to the computer and play it with QuickTime, but whenever I try to play the newly composed video using either UIWebView or AVPlayer I can only see the first part of the merged video files. Furthermore, when I tried to use MPMoviePlayerController to play it, it hangs on "Loading". I can hear audio for the whole composition. To make it…
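For reference, a minimal Swift sketch of sequential composition (names are hypothetical, not the asker's code): a frequent cause of "only the first clip plays" is inserting every clip at time zero instead of advancing a cursor, so this sketch keeps the video and audio cursors in lockstep.

```swift
import AVFoundation

// Appends each asset's video and audio track end-to-end on one
// composition track pair, advancing a shared time cursor.
func makeComposition(from assets: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrack(
        withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let audioTrack = composition.addMutableTrack(
        withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)!

    var cursor = CMTime.zero
    for asset in assets {
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        if let v = asset.tracks(withMediaType: .video).first {
            try videoTrack.insertTimeRange(range, of: v, at: cursor)
        }
        if let a = asset.tracks(withMediaType: .audio).first {
            try audioTrack.insertTimeRange(range, of: a, at: cursor)
        }
        cursor = CMTimeAdd(cursor, asset.duration)
    }
    return composition
}
```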

iOS 5: Error merging 3 videos with AVAssetExportSession

孤人 submitted on 2019-12-09 10:51:24
Question: I'm trying to merge (append) 3 videos using AVAssetExportSession, but I keep getting this error; oddly, it worked for 1 or 2 videos. Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo=0x458120 {NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export} I even tried to rerun the function on error, but all I got was the same error message repeating indefinitely. This is a snippet of my code: AVMutableComposition *mixComposition =…
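A hedged sketch of the export step (names and output URL are placeholders): -11820 is the generic export-failed error, so checking preset compatibility up front and reading session.error on failure usually narrows the diagnosis down.

```swift
import AVFoundation

// Exports a composition, verifying first that the chosen preset is
// actually compatible with it.
func export(_ mixComposition: AVMutableComposition, to outputURL: URL) {
    let presets = AVAssetExportSession.exportPresets(compatibleWith: mixComposition)
    guard presets.contains(AVAssetExportPresetHighestQuality),
          let session = AVAssetExportSession(asset: mixComposition,
                                             presetName: AVAssetExportPresetHighestQuality)
    else { return }

    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously {
        switch session.status {
        case .completed:
            print("export finished")
        case .failed, .cancelled:
            // session.error carries the underlying reason for -11820
            print("export failed:", session.error ?? "unknown")
        default:
            break
        }
    }
}
```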

AVAssetExportSession fails every time (error -12780)

一曲冷凌霜 submitted on 2019-12-09 09:28:48
Question: I'm trying to merge some audio files (picked via MPMediaPickerController), but the export always fails with error code -12780. When I try to play my composition with an AVPlayer object, it plays correctly; only the export fails. What am I doing wrong? This is my code: AVAssetExportSession *exportSession; AVPlayer *player; - (void)mergeAudiofiles { // self.mediaItems is an NSArray of MPMediaItems if (self.mediaItems.count == 0) { UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"…
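A sketch under one common assumption (names are placeholders): with media-library items, -12780 frequently traces back to items whose assetURL is nil or non-exportable (DRM-protected or cloud-only media), so each item is checked before insertion.

```swift
import AVFoundation
import MediaPlayer

// Merges library audio items end-to-end and exports as M4A, skipping
// items that cannot legally be read (nil assetURL).
func mergeAudio(items: [MPMediaItem], to outputURL: URL) throws {
    let composition = AVMutableComposition()
    let track = composition.addMutableTrack(
        withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)!

    var cursor = CMTime.zero
    for item in items {
        guard let url = item.assetURL else { continue }  // protected item
        let asset = AVURLAsset(url: url)
        guard let source = asset.tracks(withMediaType: .audio).first else { continue }
        try track.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                  of: source, at: cursor)
        cursor = CMTimeAdd(cursor, asset.duration)
    }

    guard let session = AVAssetExportSession(asset: composition,
                                             presetName: AVAssetExportPresetAppleM4A)
    else { return }
    session.outputURL = outputURL
    session.outputFileType = .m4a
    session.exportAsynchronously {
        print(session.status == .completed
              ? "done" : "failed: \(String(describing: session.error))")
    }
}
```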

All black frames when trying to write Metal frames to Quicktime file with AVFoundation AVAssetWriter

こ雲淡風輕ζ submitted on 2019-12-09 06:55:03
Question: I'm using this Swift class (shown originally in the answer to this question: Capture Metal MTKView as Movie in realtime?) to try to record my Metal app's frames to a movie file. class MetalVideoRecorder { var isRecording = false var recordingStartTime = TimeInterval(0) private var assetWriter: AVAssetWriter private var assetWriterVideoInput: AVAssetWriterInput private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor init?(outputURL url: URL, size: CGSize) { do { assetWriter…
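Since the class is cut off, here is a hedged sketch of the frame-writing step (the function name is mine, not from the linked answer): black frames typically mean the drawable's texture was read back before the GPU finished with it, or that the MTKView's framebufferOnly property was left at its default of true, which makes CPU readback with getBytes invalid.

```swift
import Metal
import AVFoundation
import CoreVideo

// Copies a finished drawable texture into a pixel buffer from the
// adaptor's pool and appends it. Assumes mtkView.framebufferOnly = false
// and that this runs after the GPU work completes (e.g. from the command
// buffer's completion handler).
func writeFrame(texture: MTLTexture, at time: CMTime,
                adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    guard let pool = adaptor.pixelBufferPool else { return }
    var pixelBufferOut: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBufferOut)
    guard let pixelBuffer = pixelBufferOut else { return }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
    // Reading a framebufferOnly texture here would yield black/garbage.
    texture.getBytes(CVPixelBufferGetBaseAddress(pixelBuffer)!,
                     bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

    if adaptor.assetWriterInput.isReadyForMoreMediaData {
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }
}
```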

How to buffer audio using AVPlayer in iOS?

寵の児 submitted on 2019-12-09 06:31:41
Question: I want to play streaming audio from the Internet. I wrote code that plays the stream, but it has no buffering, so if the signal is weak the application stops playing. This is my code: import UIKit import AVFoundation import MediaPlayer import AudioToolbox class ViewController: UIViewController { var playerItem:AVPlayerItem? var player:AVPlayer? @IBOutlet weak var PlayButton: UIButton! override func viewDidLoad() { super.viewDidLoad() var buffer = AVAudioBuffer () let url = NSURL (string: "http:/…
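A sketch of buffer-aware playback (not the asker's code; the class name is mine and the URL is whatever stream you pass in): AVPlayer buffers on its own, but stalls can be smoothed by asking for a longer forward buffer, letting the player wait rather than pause, and observing the item's buffering state.

```swift
import AVFoundation

final class StreamPlayer {
    private let player: AVPlayer
    private var observers: [NSKeyValueObservation] = []

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        item.preferredForwardBufferDuration = 30   // ask for ~30 s ahead
        player = AVPlayer(playerItem: item)
        // Let the player stall-and-wait instead of stopping outright.
        player.automaticallyWaitsToMinimizeStalling = true

        observers.append(item.observe(\.isPlaybackBufferEmpty) { item, _ in
            if item.isPlaybackBufferEmpty { print("buffering…") }
        })
        observers.append(item.observe(\.isPlaybackLikelyToKeepUp) { item, _ in
            if item.isPlaybackLikelyToKeepUp { print("enough buffered") }
        })
    }

    func play() { player.play() }
}
```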

Hold multiple Frames in Memory before sending them to AVAssetWriter

好久不见. submitted on 2019-12-09 06:22:27
Question: I need to hold some video frames from a captureSession in memory and write them to a file when 'something' happens. Similar to this solution, I use this code to put a frame into an NSMutableArray: - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { //... CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); uint8 *baseAddress = (uint8*)CVPixelBufferGetBaseAddress…
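A minimal Swift sketch of the same idea (class and method names are hypothetical): keep a bounded window of sample buffers, then flush them to an AVAssetWriterInput when the trigger fires. The caveat, which likely matters to the asker, is that capture sample buffers come from a small fixed pool, so the window must stay small or the camera will stall.

```swift
import AVFoundation

// Holds the last N capture frames and flushes them on demand. Assumes
// the writer session was already started at or before the first frame's
// presentation timestamp.
final class FrameBuffer {
    private var frames: [CMSampleBuffer] = []
    private let capacity = 30   // ~1 second at 30 fps; keep this small

    func add(_ sampleBuffer: CMSampleBuffer) {
        frames.append(sampleBuffer)
        if frames.count > capacity { frames.removeFirst() }
    }

    func flush(to input: AVAssetWriterInput) {
        for frame in frames where input.isReadyForMoreMediaData {
            _ = input.append(frame)
        }
        frames.removeAll()
    }
}
```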

SKVideoNode (embedded in SKScene) as texture for SceneKit node not working

戏子无情 submitted on 2019-12-09 05:52:59
Question: I'm attempting to map a video as a texture onto a primitive cylinder for a VR project using SceneKit: an SKVideoNode embedded in an SKScene serves as the texture for a SceneKit SCNTube object, and I just can't get the video to display the way a still image would. The playground code below should generate moving video mapped to the cylinder, but the mapping does not work. EDIT: ADDED A SINGLE LINE AT THE END OF THE LISTING TO FIX IT; THE CODE BELOW SHOULD NOW WORK. import UIKit import SceneKit // for 3D mapping import SpriteKit // for…
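A condensed sketch of the working technique (the URL and dimensions are placeholders): an SKVideoNode inside an SKScene used as the diffuse contents of the tube's material. Calling play() on the video node, plausibly the "single line" fix mentioned in the edit, is what starts the frames updating.

```swift
import SceneKit
import SpriteKit
import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)
let videoNode = SKVideoNode(avPlayer: player)

// Host the video node in a SpriteKit scene sized like the texture.
let skScene = SKScene(size: CGSize(width: 1024, height: 512))
videoNode.position = CGPoint(x: skScene.size.width / 2,
                             y: skScene.size.height / 2)
videoNode.size = skScene.size
skScene.addChild(videoNode)

// Use the SKScene as the material of the cylinder.
let tube = SCNTube(innerRadius: 1.9, outerRadius: 2.0, height: 3.0)
tube.firstMaterial?.diffuse.contents = skScene
tube.firstMaterial?.isDoubleSided = true
let tubeNode = SCNNode(geometry: tube)

videoNode.play()   // without this the texture never animates
```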

AVFoundation decode prores4444 movie with alpha channel

我怕爱的太早我们不能终老 submitted on 2019-12-09 04:30:33
I'm trying to decode a ProRes 4444 video with an alpha channel on iOS in Swift, to overlay it as a complex animation over a user's video and export the result to their library. The AVFoundation documentation is not that great and I'm struggling to find any code examples. When I try to use the code below with AVAssetReaderTrackOutput to decode the video I get an "AVAssetReaderOutput does not currently support compressed output" error. let avAssetReaderVideoCompositionOutput = AVAssetReaderVideoCompositionOutput(videoTracks: [videoOverlayAssetTrack], videoSettings: [AVVideoCodecKey:…
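A hedged sketch of the fix (variable names are placeholders): AVAssetReader outputs take decompression settings, i.e. pixel-format keys, not AVVideoCodecKey; passing a codec key is exactly what produces the "does not currently support compressed output" error. A 32BGRA pixel format preserves the ProRes 4444 alpha channel.

```swift
import AVFoundation

func readOverlayFrames(from overlayURL: URL) throws {
    let asset = AVURLAsset(url: overlayURL)
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

    // Pixel-format (decompression) settings, not a codec key:
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let output = AVAssetReaderTrackOutput(track: videoTrack,
                                          outputSettings: settings)
    let reader = try AVAssetReader(asset: asset)
    reader.add(output)
    reader.startReading()

    while let sample = output.copyNextSampleBuffer() {
        // Each sample now wraps a decoded BGRA pixel buffer with alpha.
        _ = CMSampleBufferGetImageBuffer(sample)
    }
}
```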

Panning a mono signal with MultiChannelMixer & MTAudioProcessingTap

☆樱花仙子☆ submitted on 2019-12-09 03:45:28
I'm looking to pan a mono signal using MTAudioProcessingTap and a Multichannel Mixer audio unit, but I am getting mono output instead of a panned, stereo output. The documentation states: "The Multichannel Mixer unit (subtype kAudioUnitSubType_MultiChannelMixer) takes any number of mono or stereo streams and combines them into a single stereo output." So the mono output was unexpected. Is there any way around this? I ran a stereo signal through the exact same code and everything worked great: stereo output, panned as expected. Here's the code from my tap's prepare callback: static void tap…
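A sketch of one plausible workaround (not the asker's tap code; the function is mine): explicitly give the mixer a two-channel output stream format and drive the pan parameter, so a mono input bus is rendered into both output channels.

```swift
import AudioToolbox

// `mixerUnit` is assumed to be an already-created
// kAudioUnitSubType_MultiChannelMixer instance.
func configureStereoPan(on mixerUnit: AudioUnit, pan: Float32) {
    var stereo = AudioStreamBasicDescription(
        mSampleRate: 44100,
        mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kAudioFormatFlagIsFloat | kAudioFormatFlagIsNonInterleaved,
        mBytesPerPacket: 4,
        mFramesPerPacket: 1,
        mBytesPerFrame: 4,
        mChannelsPerFrame: 2,        // stereo out: pan needs two channels
        mBitsPerChannel: 32,
        mReserved: 0)

    AudioUnitSetProperty(mixerUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output, 0,
                         &stereo,
                         UInt32(MemoryLayout<AudioStreamBasicDescription>.size))

    // -1 = hard left, 0 = centre, +1 = hard right, applied to input bus 0.
    AudioUnitSetParameter(mixerUnit,
                          kMultiChannelMixerParam_Pan,
                          kAudioUnitScope_Input, 0,
                          pan, 0)
}
```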