avfoundation

Set maximum frame rate with AVFoundation in iOS 5

限于喜欢 submitted on 2019-12-02 19:13:51
I believe this used to be done with captureOutput.minFrameDuration. However, that property is deprecated in iOS 5. Instead I apparently need to use AVCaptureConnection's videoMinFrameDuration. So I have my input and my output, and I add them both to the capture session - where can I get access to the capture connection? I think it is created for me by the session, but where? I could try adding the I/O using addInputWithNoConnections and addOutputWithNoConnections and then creating the connection manually, but that seems like a lot of hassle just to set a maximum frame rate. Plus, Xcode complains
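
The connection is indeed created for you: once the output has been added to the session, it can be retrieved from the output itself rather than wired up manually. A minimal sketch in modern Swift (names like capFrameRate and maxFPS are illustrative; note that on iOS 7 and later the frame-duration properties moved again, to AVCaptureDevice):

```swift
import AVFoundation

// A sketch: the session builds the AVCaptureConnection itself when addOutput
// is called, so you can fetch it from the output instead of creating it by hand.
func capFrameRate(session: AVCaptureSession,
                  output: AVCaptureVideoDataOutput,
                  device: AVCaptureDevice,
                  maxFPS: Int32) throws {
    session.addOutput(output)

    if let connection = output.connection(with: .video) {
        // On iOS 5-6, this connection is where videoMinFrameDuration was set.
        print("session-created connection: \(connection)")
    }

    // On iOS 7 and later the supported route is the capture device; a minimum
    // frame duration of 1/maxFPS caps the frame rate at maxFPS (the device's
    // activeFormat must support that rate).
    try device.lockForConfiguration()
    device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: maxFPS)
    device.unlockForConfiguration()
}
```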

Compositing 2 videos on top of each other with alpha

社会主义新天地 submitted on 2019-12-02 18:39:14
AVFoundation allows you to "compose" 2 assets (2 videos) as 2 "tracks", just like in Final Cut Pro, for example. In theory I can have 2 videos on top of each other, with alpha, and see both. Either I'm doing something wrong, or there's a bug somewhere, because the following test code, although a bit messy, should clearly show 2 videos, and I only see one, as seen here: http://lockerz.com/s/172403384 -- the "blue" square is IMG_1388.m4v. For whatever reason, IMG_1383.MOV is never shown. NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES]
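
A likely culprit, for what it's worth: stacking two video tracks in an AVMutableComposition is not enough on its own, because without an AVVideoComposition carrying layer instructions only one track gets rendered. A sketch of the missing piece, assuming the two composition tracks and the total duration are already in hand:

```swift
import AVFoundation

// A sketch: blend two already-inserted video tracks by giving the front
// track partial opacity via layer instructions.
func videoComposition(top: AVMutableCompositionTrack,
                      bottom: AVMutableCompositionTrack,
                      duration: CMTime,
                      renderSize: CGSize) -> AVMutableVideoComposition {
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: duration)

    let topLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: top)
    topLayer.setOpacity(0.5, at: .zero)   // let the bottom track show through

    let bottomLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: bottom)

    // The first instruction in the array is the frontmost layer.
    instruction.layerInstructions = [topLayer, bottomLayer]

    let videoComp = AVMutableVideoComposition()
    videoComp.instructions = [instruction]
    videoComp.renderSize = renderSize
    videoComp.frameDuration = CMTime(value: 1, timescale: 30)
    return videoComp
}
```

The composition is then handed to the AVPlayerItem or AVAssetExportSession as its videoComposition.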

AVCapture appendSampleBuffer

和自甴很熟 submitted on 2019-12-02 18:37:21
I am going insane with this one - I have looked everywhere and tried anything and everything I can think of. I am making an iPhone app that uses AVFoundation - specifically AVCapture to capture video using the iPhone camera. I need a custom image that is overlaid on the video feed to be included in the recording. So far I have the AVCapture session set up, can display the feed, access the frame, save it as a UIImage and merge the overlay image onto it. Then I convert this new UIImage into a CVPixelBufferRef. And to double check that the bufferRef is working I converted it back to a UIImage and
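
For the pixel-buffer appending step, the usual failure points are the writer's state, input readiness, and presentation timestamps rather than the buffer itself. A hedged sketch, assuming an AVAssetWriterInputPixelBufferAdaptor is already wired to the writer input:

```swift
import AVFoundation

// A sketch of a defensive append: the writer must be in .writing state, the
// input must be ready, and each frame needs a strictly increasing
// presentation time, or append returns false.
func append(pixelBuffer: CVPixelBuffer,
            at time: CMTime,
            to adaptor: AVAssetWriterInputPixelBufferAdaptor) -> Bool {
    guard adaptor.assetWriterInput.isReadyForMoreMediaData else {
        return false   // try again later, e.g. via requestMediaDataWhenReady
    }
    return adaptor.append(pixelBuffer, withPresentationTime: time)
}
```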

How to resume background audio in Swift 2 / AVPlayer?

非 Y 不嫁゛ submitted on 2019-12-02 18:35:01
I am learning Swift as my first programming language. I've struggled for many hours to resume background audio playback after an interruption (e.g. a call). What should happen: Audio keeps playing when the app goes to the background (works). When interrupted by a call, I get the notification that the interruption began (works). When the call ends, I get the notification that the interruption ended (works). Resume playing the audio (does NOT work - I hear nothing). I would really appreciate any help! Thanks. Notes: The app is registered for background audio and plays fine before the interruption. I have tried with and without the time delay to
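
The step that is most often missing is re-activating the shared AVAudioSession when the interruption ends, before calling play(). A sketch in current Swift syntax (the Swift 2 names differ slightly):

```swift
import AVFoundation

// A sketch of the interruption handling; the piece most often missing is
// re-activating the audio session before calling play() again.
final class PlaybackController {
    let player = AVPlayer()

    init() {
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleInterruption(_:)),
            name: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance())
    }

    @objc private func handleInterruption(_ note: Notification) {
        guard let info = note.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue),
              type == .ended else { return }

        let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
        if AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
            // Re-activate the session first; play() frequently produces no
            // audio if the session is still inactive after the call ended.
            try? AVAudioSession.sharedInstance().setActive(true)
            player.play()
        }
    }
}
```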

Making video from UIImage array with different transition animations

a 夏天 submitted on 2019-12-02 18:29:29
I am following this code to create a video from a UIImage array, but there is no animation when transitioning from one image to another. I want to add some photo transition effects like these: TransitionFlipFromTop, TransitionFlipFromBottom, TransitionFlipFromLeft, TransitionFlipFromRight, TransitionCurlUp, TransitionCurlDown, TransitionCrossDissolve, FadeIn, FadeOut. These animations can be done via UIView.transition() & UIView.animate(), but how can I apply them while making a video from a UIImage array? I have searched a lot but didn't find anything. I've also tried
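
One caveat worth knowing: UIView.transition() and UIView.animate() only affect on-screen views, and AVAssetWriter never sees them. A common workaround is to render the intermediate frames yourself and append each one as a pixel buffer. A sketch of a hand-rolled cross dissolve (frameCount and size are whatever your writer expects):

```swift
import UIKit

// A sketch: fake a cross dissolve by drawing image B over image A with
// increasing alpha, producing the in-between frames to feed the writer.
func crossDissolveFrames(from a: UIImage, to b: UIImage,
                         frameCount: Int, size: CGSize) -> [UIImage] {
    let renderer = UIGraphicsImageRenderer(size: size)
    let steps = max(frameCount - 1, 1)
    return (0..<frameCount).map { i in
        let progress = CGFloat(i) / CGFloat(steps)
        return renderer.image { _ in
            a.draw(in: CGRect(origin: .zero, size: size))
            b.draw(in: CGRect(origin: .zero, size: size),
                   blendMode: .normal, alpha: progress)
        }
    }
}
```

Flips and curls would need Core Animation (e.g. an AVVideoCompositionCoreAnimationTool) rather than per-frame blending, but the same "render it yourself" principle applies.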

Cropping a captured image exactly to how it looks in AVCaptureVideoPreviewLayer

感情迁移 submitted on 2019-12-02 18:25:56
I have a photo app that is using AVFoundation. I have set up a preview layer using AVCaptureVideoPreviewLayer that takes up the top half of the screen, so when the user is trying to take their photo, all they can see is what the top half of the screen sees. This works great, but when the user actually takes the photo and I try to set the photo as the layer's contents, the image is distorted. I did research and realized that I need to crop the image. All I want to do is crop the full captured image so that all that is left is exactly what the user could originally see in the top half of
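
One approach (assuming iOS 11+; earlier systems have metadataOutputRectOfInterestForRect instead) is to ask the preview layer which normalized region of the capture it is actually showing, then crop the full image to that region. Orientation handling is omitted in this sketch:

```swift
import AVFoundation
import UIKit

// A sketch: metadataOutputRectConverted maps the preview layer's visible
// bounds into normalized (0-1) capture coordinates, which then scale up to
// pixel coordinates for CGImage cropping.
func crop(_ image: UIImage,
          toVisibleRectOf previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    let outputRect = previewLayer.metadataOutputRectConverted(fromLayerRect: previewLayer.bounds)
    guard let cgImage = image.cgImage else { return nil }

    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let cropRect = CGRect(x: outputRect.origin.x * width,
                          y: outputRect.origin.y * height,
                          width: outputRect.width * width,
                          height: outputRect.height * height)

    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale,
                   orientation: image.imageOrientation)
}
```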

What does shouldOptimizeForNetworkUse actually do?

心不动则不痛 submitted on 2019-12-02 18:13:35
The Apple documentation just says: When the value of this property is YES, the output file will be written in such a way that playback can start after only a small amount of the file is downloaded. But what is actually happening? Matti Savolainen: When shouldOptimizeForNetworkUse is set to YES, calling finishWriting will move the MP4 moov atom (movie atom) from the end of the file to the beginning of the file. The moov atom contains information about the movie file, like timescale and duration. The moov also contains "subatoms" which contain information like the tracks, the data offsets
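
For reference, setting the flag is a single property assignment on AVAssetWriter (AVAssetExportSession exposes the same property); the moov relocation described above happens inside finishWriting:

```swift
import AVFoundation

// A sketch; `outputURL` is a hypothetical destination for the MP4.
func makeStreamableWriter(outputURL: URL) throws -> AVAssetWriter {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    // finishWriting will relocate the moov atom to the start of the file,
    // so playback can begin before the whole file has downloaded.
    writer.shouldOptimizeForNetworkUse = true
    return writer
}
```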

Problem when attempting to loop AVPlayer (userCapturedVideo) seamlessly

雨燕双飞 submitted on 2019-12-02 18:13:34
Question: I have been looking around for a while on how to accomplish this correctly. I have looked here and here, and have used the top answer here to try to accomplish this; however, for me the recorded video never even begins to loop. The first frame shows up but the video does not play, so I am wondering what's wrong. func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) { if (error != nil) { print(
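
If the notification-based approach keeps misbehaving, AVPlayerLooper (iOS 10+) is worth trying instead, since it queues copies of the item for gapless looping. A sketch; note the looper must be kept alive:

```swift
import AVFoundation

// A sketch: AVPlayerLooper handles seamless looping and avoids the
// seek-back-to-zero stutter of the notification approach.
final class LoopingPlayer {
    private let queuePlayer = AVQueuePlayer()
    private var looper: AVPlayerLooper?   // must be retained or looping stops

    func play(url: URL, in layer: AVPlayerLayer) {
        let item = AVPlayerItem(url: url)
        looper = AVPlayerLooper(player: queuePlayer, templateItem: item)
        layer.player = queuePlayer
        queuePlayer.play()
    }
}
```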

Image/Text overlay in video swift

若如初见. submitted on 2019-12-02 17:46:00
I am working with an image overlay for a watermark effect in video using Swift. I am using AVFoundation for this, but somehow I am not succeeding. Following is my code for overlaying the image/text: let path = NSBundle.mainBundle().pathForResource("sample_movie", ofType:"mp4") let fileURL = NSURL(fileURLWithPath: path!) let composition = AVMutableComposition() var vidAsset = AVURLAsset(URL: fileURL, options: nil) // get video track let vtrack = vidAsset.tracksWithMediaType(AVMediaTypeVideo) let videoTrack:AVAssetTrack = vtrack[0] as! AVAssetTrack let vid_duration = videoTrack.timeRange.duration let vid_timerange =
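
The usual route for a burned-in watermark is AVVideoCompositionCoreAnimationTool, which composites a CALayer tree over the video during export. A sketch in current Swift syntax (placement values are arbitrary):

```swift
import AVFoundation
import UIKit

// A sketch: the export composites `overlayLayer` on top of the video frames
// rendered into `videoLayer`.
func animationTool(videoSize: CGSize,
                   watermark: UIImage) -> AVVideoCompositionCoreAnimationTool {
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: videoSize)

    let overlayLayer = CALayer()
    overlayLayer.contents = watermark.cgImage
    overlayLayer.frame = CGRect(x: 20, y: 20, width: 160, height: 60) // arbitrary
    overlayLayer.opacity = 0.6

    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: videoSize)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(overlayLayer)

    return AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer,
                                               in: parentLayer)
}
```

The returned tool is assigned to the AVMutableVideoComposition's animationTool before handing the composition to an AVAssetExportSession.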

Does anyone know how to implement the AVAssetResourceLoaderDelegate methods correctly?

随声附和 submitted on 2019-12-02 17:45:14
I am trying to coax AVFoundation into reading from a custom URL. The custom URL stuff works. The code below creates an NSData with a movie file: NSData* movieData = [NSData dataWithContentsOfURL:@"memory://video"]; I've set up an AVAssetResourceLoader object using the following code: NSURL* url = [NSURL URLWithString:@"memory://video"]; AVURLAsset* asset = [[AVURLAsset alloc] initWithURL:url options:nil]; AVAssetResourceLoader* loader = [asset resourceLoader]; [loader setDelegate:self queue:mDispatchQueue]; The dispatch queue is concurrent. I then try to extract the first frame from the movie:
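
For the delegate itself, the two things that must be answered are the content-information request (type, length, byte-range support) and the data request. A sketch in Swift for an in-memory "memory://" scheme, assuming movieData holds the complete file:

```swift
import AVFoundation

// A sketch of the delegate for a custom scheme; real implementations must
// fill in the contentInformationRequest or playback never starts.
final class MemoryLoader: NSObject, AVAssetResourceLoaderDelegate {
    let movieData: Data
    init(movieData: Data) { self.movieData = movieData }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        if let info = loadingRequest.contentInformationRequest {
            info.contentType = AVFileType.mp4.rawValue   // UTI of the payload
            info.contentLength = Int64(movieData.count)
            info.isByteRangeAccessSupported = true
        }
        if let dataRequest = loadingRequest.dataRequest {
            // Serve exactly the byte range AVFoundation asked for.
            let start = Int(dataRequest.requestedOffset)
            let end = min(start + dataRequest.requestedLength, movieData.count)
            if start < end {
                dataRequest.respond(with: movieData.subdata(in: start..<end))
            }
        }
        loadingRequest.finishLoading()
        return true
    }
}
```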