avfoundation

How to add static and dynamic overlays to video with AVAssetWriter?

泪湿孤枕 submitted on 2019-12-05 09:32:31
What's the right way to add an image overlay to a video created with AVAssetWriter? It's possible to do so with AVAssetExportSession, but this question is about how to do it with AVAssetWriter, so there is more control over the quality and output. There are two scenarios:

1) Simple: add a single overlay that is present for the entire duration of the video (similar to a watermark).
2) Complex: add different overlays that animate in and out of the video at different times (similar to using AVVideoCompositionCoreAnimationTool).

Tim Bull: There are a lot of different approaches to this, and the correct…
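For the simple watermark scenario, a minimal sketch (assuming an AVAssetWriterInputPixelBufferAdaptor pipeline with 32BGRA pixel buffers; overlayImage and the placement rect are illustrative, not from the original question) is to draw the overlay into each pixel buffer before appending it:

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

// Draws `overlayImage` onto a (assumed kCVPixelFormatType_32BGRA) pixel buffer
// just before it is appended via an AVAssetWriterInputPixelBufferAdaptor.
static void DrawOverlayOnPixelBuffer(CVPixelBufferRef pixelBuffer, CGImageRef overlayImage)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Wrap the pixel buffer's memory in a bitmap context so Core Graphics can draw into it.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Composite the watermark in the bottom-left corner; adjust the rect as needed.
    CGContextDrawImage(context,
                       CGRectMake(16, 16, CGImageGetWidth(overlayImage), CGImageGetHeight(overlayImage)),
                       overlayImage);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

For the complex scenario, the same drawing hook can consult each buffer's presentation timestamp to decide which overlay, if any, to composite onto that frame.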

Play video with AVPlayer after AVCaptureMovieFileOutput

寵の児 submitted on 2019-12-05 08:38:41
I've created a custom camera similar to Snapchat with AVFoundation. The picture side works: I take a picture and display it. Now I'm trying to record a video and display it. I have captured a video file successfully; it has been tested and plays in MPMoviePlayer, but I am unable to play it back with AVPlayer.

- (void)takeVideoAction:(id)sender {
    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output.mp4"];
    NSFileManager *manager = [[NSFileManager alloc] init];
    if ([manager fileExistsAtPath:outputPath]) {
        [manager removeItemAtPath:outputPath error:nil];
    }
    …
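A minimal playback sketch for the recorded file (the self.player property and method name are illustrative). A frequent pitfall here is building the URL with +URLWithString: from a local path; AVPlayer needs a proper file URL from +fileURLWithPath::

- (void)playRecordingAtPath:(NSString *)outputPath
{
    NSURL *fileURL = [NSURL fileURLWithPath:outputPath]; // not [NSURL URLWithString:outputPath]

    AVPlayer *player = [AVPlayer playerWithURL:fileURL];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.view.bounds;
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:playerLayer];

    // Keep a strong reference to the player (e.g. in a property), or it may be
    // deallocated before playback starts.
    self.player = player;
    [player play];
}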

How to save a UIImage to documents directory?

拜拜、爱过 submitted on 2019-12-05 08:19:01
I'm trying to save both a recorded video's file path and a thumbnail from the video to the documents directory, then set those two values on an object so I can use that object to populate a collection view. With the code I currently have (below), after I record a video the video path gets saved to the documents directory, the video path and thumbnail get set on my Post object, and the thumbnail appears properly in my collection view. All good so far. However, only the video path persists between app re-launches, since it lives in the documents directory; the thumbnail doesn't. I'd…
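A minimal sketch of persisting the thumbnail alongside the video path, assuming the thumbnail is already a UIImage (the method and file names are illustrative):

- (NSString *)saveThumbnail:(UIImage *)thumbnail withName:(NSString *)name
{
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                   NSUserDomainMask, YES) firstObject];
    NSString *thumbnailPath = [documentsPath stringByAppendingPathComponent:
                               [name stringByAppendingPathExtension:@"jpg"]];

    // Encode as JPEG and write atomically; return nil on failure so callers can react.
    NSData *imageData = UIImageJPEGRepresentation(thumbnail, 0.8);
    BOOL written = [imageData writeToFile:thumbnailPath atomically:YES];
    return written ? thumbnailPath : nil;
}

Note that on iOS the absolute documents path can change between launches, so it is safer to persist only the file name, rebuild the full path at load time, and restore the image with [UIImage imageWithContentsOfFile:].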

AVPlayer's video and audio become out of sync after pausing and then resuming video

北城余情 submitted on 2019-12-05 07:46:56
I'm using AVPlayer to play videos in my app. Video playback always works perfectly, except for when you pause/resume the video. If the user presses the pause button and then resumes the video, sometimes the audio will be ahead of the video. The video resumes at the correct location, but the audio is ahead; it's as if the audio keeps running while paused. When they press the pause button, all I am doing is calling the pause method of the AVPlayer, and I have also tried setting its rate property to 0.0, which is supposed to be exactly the same thing according to the documentation. I…
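One workaround sometimes used for this kind of drift (a sketch, not an official fix; self.player is illustrative) is to seek exactly to the current time before resuming, so the audio and video renderers restart from the same point:

- (void)resumePlayback
{
    // Zero tolerances force a frame-accurate seek, realigning both renderers.
    CMTime current = self.player.currentTime;
    [self.player seekToTime:current
            toleranceBefore:kCMTimeZero
             toleranceAfter:kCMTimeZero
          completionHandler:^(BOOL finished) {
        if (finished) {
            [self.player play];
        }
    }];
}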

Getting Slow Motion meta data from captured video in iOS

。_饼干妹妹 submitted on 2019-12-05 07:09:10
We have a video app that imports video from the user's camera roll. Our problem is importing slo-mo video taken with the native Camera app. We can recognise that there is a higher-than-normal frame rate (e.g. 120 or 240 fps). What we can't find is the metadata that specifies when the video drops into slow motion and when it speeds up again. Does anyone know where this information is kept, and/or how to get at it? Is it in the file itself, or stored in a separate meta file somewhere? Any help would be hugely appreciated, thanks!

The slow motion segments are technically not metadata…
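Consistent with that answer, a hedged sketch: when Photos returns a slo-mo asset, it typically arrives as an AVComposition whose track segments carry scaled time mappings; segments where the target duration exceeds the source duration are the slowed spans. The function name is illustrative:

#import <Photos/Photos.h>
#import <AVFoundation/AVFoundation.h>

void InspectSlowMotionSegments(PHAsset *phAsset)
{
    PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
    options.version = PHVideoRequestOptionsVersionCurrent; // rendition that includes the slo-mo ramps

    [[PHImageManager defaultManager] requestAVAssetForVideo:phAsset
                                                    options:options
                                              resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
        if (![asset isKindOfClass:[AVComposition class]]) {
            return; // not a slo-mo rendition
        }
        AVCompositionTrack *videoTrack =
            [[(AVComposition *)asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        for (AVCompositionTrackSegment *segment in videoTrack.segments) {
            CMTimeMapping mapping = segment.timeMapping;
            Float64 sourceSeconds = CMTimeGetSeconds(mapping.source.duration);
            Float64 targetSeconds = CMTimeGetSeconds(mapping.target.duration);
            // Ratios greater than 1 indicate the slowed-down spans of the clip.
            NSLog(@"segment rate ratio: %f", targetSeconds / sourceSeconds);
        }
    }];
}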

AVCaptureSession barcode scan

送分小仙女□ submitted on 2019-12-05 07:06:57
I'm currently working with AVCaptureSession and AVCaptureMetadataOutput. It works perfectly, but I just want to know how to restrict scanning and analysis of metadata objects to a specific region of the AVCaptureVideoPreviewLayer?

Here is a sample of code from a project I have that may help you on the right track:

// where 'self.session' is a previously set up AVCaptureSession
// setup metadata capture
AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
[self.session addOutput:metadataOutput];
[metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
…
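For the actual question, restricting the scan region: AVCaptureMetadataOutput has a rectOfInterest property that takes normalized, rotated coordinates, so it is easiest to convert a view rect through the preview layer. A sketch continuing the sample above (self.previewLayer and the scan rect are illustrative):

CGRect scanRect = CGRectMake(40, 100, 240, 240); // region in view coordinates

// Convert only after the session is running; before that the
// conversion can return an empty rect.
CGRect interestRect = [self.previewLayer metadataOutputRectOfInterestForRect:scanRect];
metadataOutput.rectOfInterest = interestRect;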

convert .caf file to .wav file with progress bar in ios

回眸只為那壹抹淺笑 submitted on 2019-12-05 07:06:52
Question: I record audio in .caf format and later need to convert it to .wav in order to upload the file to Dropbox. How can I convert the file to .wav format on iOS? I don't want to record the audio directly in .wav format. Since the conversion takes quite a long time, I also need to show a conversion progress bar.

Answer 1: Try this:

-(void)convertToWav {
    // set up an AVAssetReader to read from the iPod Library
    NSString *cafFilePath = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"caf"];
    …
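The usual shape of that answer is an AVAssetReader feeding an AVAssetWriter configured for Linear PCM (which is what a .wav container holds), with progress derived from each buffer's presentation timestamp. A hedged sketch; everything outside the AVFoundation API (method name, progress view) is illustrative:

- (void)convertCAFAtURL:(NSURL *)inputURL toWAVAtURL:(NSURL *)outputURL
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];

    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    AVAssetReaderTrackOutput *readerOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack
                                                   outputSettings:@{AVFormatIDKey : @(kAudioFormatLinearPCM)}];
    [reader addOutput:readerOutput];

    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeWAVE
                                                        error:&error];
    NSDictionary *pcmSettings = @{
        AVFormatIDKey : @(kAudioFormatLinearPCM),
        AVSampleRateKey : @44100.0,
        AVNumberOfChannelsKey : @1,
        AVLinearPCMBitDepthKey : @16,
        AVLinearPCMIsFloatKey : @NO,
        AVLinearPCMIsBigEndianKey : @NO,
        AVLinearPCMIsNonInterleaved : @NO
    };
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:pcmSettings];
    [writer addInput:writerInput];

    [reader startReading];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    Float64 totalSeconds = CMTimeGetSeconds(asset.duration);
    dispatch_queue_t queue = dispatch_queue_create("wav.convert", NULL);

    [writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while (writerInput.readyForMoreMediaData) {
            CMSampleBufferRef buffer = [readerOutput copyNextSampleBuffer];
            if (buffer == NULL) {
                [writerInput markAsFinished];
                [writer finishWritingWithCompletionHandler:^{ /* conversion done */ }];
                break;
            }
            // Progress = how far into the asset the current buffer sits.
            Float64 done = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(buffer));
            dispatch_async(dispatch_get_main_queue(), ^{
                self.progressView.progress = (float)(done / totalSeconds); // illustrative UI hook
            });
            [writerInput appendSampleBuffer:buffer];
            CFRelease(buffer);
        }
    }];
}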

precise timing with AVMutableComposition

泄露秘密 submitted on 2019-12-05 06:13:47
I'm trying to use AVMutableComposition to play a sequence of sound files at precise times. When the view loads, I create the composition with the intent of playing 4 sounds evenly spaced over 1 second. It shouldn't matter how long or short the sounds are; I just want to fire them at exactly 0, 0.25, 0.5 and 0.75 seconds:

AVMutableComposition *composition = [[AVMutableComposition alloc] init];
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
for (NSInteger i = 0; i < 4; i++) {
    AVMutableCompositionTrack *track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    …
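A hedged, self-contained sketch of the intended composition; soundURLs is an illustrative array of four local file URLs, not something from the original question:

AVMutableComposition *composition = [[AVMutableComposition alloc] init];
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};

for (NSInteger i = 0; i < 4; i++) {
    // One track per clip, so the clips can overlap freely.
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:soundURLs[i] options:options];
    AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];

    NSError *error = nil;
    [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                   ofTrack:sourceTrack
                    atTime:CMTimeMake(i, 4)  // i/4 s: fires at 0, 0.25, 0.5, 0.75
                     error:&error];
}

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];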

Creating a AVAsset with a HTTP NSURL

≯℡__Kan透↙ submitted on 2019-12-05 05:55:57
Question: I'm trying to merge two NSURLs that contain video references. One of the URLs points to a video on AWS and the other points to a video that is stored locally. My exporting code works, because I've tried it with two local videos, but whenever I try to merge the HTTP URL and the local URL I get this error:

Error Domain=NSURLErrorDomain Code=-1100 "The requested URL was not found on this server." UserInfo=0x155d2f20 {NSUnderlyingError=0x155b4f60 "The operation couldn't be completed. No such file or…
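Since AVAssetExportSession generally cannot export a composition built on a remote HTTP asset, one workaround (a sketch; method and file names are illustrative) is to download the remote clip to a temporary local file first and then merge two local URLs:

- (void)downloadRemoteVideoAtURL:(NSURL *)remoteURL completion:(void (^)(NSURL *localURL))completion
{
    NSURLSessionDownloadTask *task =
        [[NSURLSession sharedSession] downloadTaskWithURL:remoteURL
                                        completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
        if (error != nil || location == nil) {
            completion(nil);
            return;
        }
        // Move the download out of its transient location before using it.
        NSString *destPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"remote.mp4"];
        NSURL *destURL = [NSURL fileURLWithPath:destPath];
        [[NSFileManager defaultManager] removeItemAtURL:destURL error:nil];
        [[NSFileManager defaultManager] moveItemAtURL:location toURL:destURL error:nil];
        completion(destURL);
    }];
    [task resume];
}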

Capturing iSight image using AVFoundation on Mac

若如初见. submitted on 2019-12-05 05:19:51
I previously had this code to capture a single image from a Mac's iSight camera using QTKit:

- (NSError *)takePicture {
    BOOL success;
    NSError *error;
    captureSession = [QTCaptureSession new];
    QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    success = [device open:&error];
    if (!success) { return error; }
    QTCaptureDeviceInput *captureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
    success = [captureSession addInput:captureDeviceInput error:&error];
    if (!success) { return error; }
    QTCaptureDecompressedVideoOutput *…
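A hedged sketch of the AVFoundation equivalent, using AVCaptureStillImageOutput (the classic replacement for this QTKit path on the Mac, later superseded by AVCapturePhotoOutput). Error handling is abbreviated and the output path is illustrative:

#import <AVFoundation/AVFoundation.h>

- (void)takePictureWithAVFoundation
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input == nil) { return; }
    [session addInput:input];

    AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:stillOutput];
    [session startRunning]; // in real code, wait until the connection is active before capturing

    AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                             completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *captureError) {
        if (sampleBuffer != NULL) {
            NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            // Persist or display jpegData here.
            [jpegData writeToFile:@"/tmp/isight.jpg" atomically:YES];
        }
        [session stopRunning];
    }];
}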