AVAssetWriter

iOS: How to make a movie with a series of images using AVAssetWriter

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-06 00:21:37
I have seen this question asked many times in different forms, both here and in other forums. Some of the questions get answered, some do not. There are a few where the answerer or author claims to have had success. I have implemented the examples from those that claim success, but have yet to see the same results. I am able to successfully use AVAssetWriter (and AVAssetWriterInputPixelBufferAdaptor) to write image data and audio data simultaneously when the sample buffers are obtained from an AVCaptureSession. However, if I have CGImageRefs that were generated in some other way, and build a …
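As a point of reference, a minimal sketch of the usual non-capture-session path: turn each CGImageRef into a CVPixelBuffer yourself and append it through the pixel buffer adaptor with timestamps you compute. The helper name pixelBufferFromCGImage and the fixed 30 fps timing are illustrative assumptions, not the poster's code.

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

// Hypothetical helper: copy a CGImageRef into a freshly created 32ARGB CVPixelBuffer.
static CVPixelBufferRef pixelBufferFromCGImage(CGImageRef image, size_t width, size_t height) {
    NSDictionary *attrs = @{ (id)kCVPixelBufferCGImageCompatibilityKey : @YES,
                             (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES };
    CVPixelBufferRef buffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32ARGB,
                        (__bridge CFDictionaryRef)attrs, &buffer);
    CVPixelBufferLockBaseAddress(buffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                 width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(buffer),
                                                 colorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return buffer;
}

// Append the images at a fixed 30 fps through an already-configured adaptor.
static void appendImages(NSArray *cgImages, AVAssetWriterInput *input,
                         AVAssetWriterInputPixelBufferAdaptor *adaptor,
                         size_t width, size_t height) {
    for (NSUInteger i = 0; i < cgImages.count; i++) {
        CVPixelBufferRef buffer = pixelBufferFromCGImage((__bridge CGImageRef)cgImages[i], width, height);
        while (!input.isReadyForMoreMediaData) { [NSThread sleepForTimeInterval:0.01]; }
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake((int64_t)i, 30)];
        CVPixelBufferRelease(buffer);
    }
}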

AVFoundation Reverse Video

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-06 00:05:14
Question: I am trying to produce a reversed video. While playing the asset in AVPlayer I set rate = -1 to make it play in reverse, but how do I export that video? I looked into the docs and read about AVAssetWriter, sample buffers, and compositions, but didn't find any way to do this. I have referred to the links below: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios and the reverse video demo https://github.com/mikaelhellqvist/ReverseClip . The example above no longer works in iOS 8, and it is not …
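No answer is quoted in this excerpt, but the approach ReverseClip takes, roughly, is: decode every frame with AVAssetReader, keep the pixel buffers, then append them in reverse order with ascending presentation times through an AVAssetWriterInputPixelBufferAdaptor. A heavily simplified sketch, assuming the clip is short enough to hold all decoded frames in memory and that the constant frame rate fps is known:

#import <AVFoundation/AVFoundation.h>

// Decode all frames, then write them back in reverse order with increasing timestamps.
static void writeReversed(AVAsset *asset, AVAssetWriter *writer,
                          AVAssetWriterInput *writerInput,
                          AVAssetWriterInputPixelBufferAdaptor *adaptor,
                          int32_t fps) {
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:NULL];
    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track outputSettings:settings];
    [reader addOutput:output];
    [reader startReading];

    NSMutableArray *frames = [NSMutableArray array];
    CMSampleBufferRef sample = NULL;
    while ((sample = [output copyNextSampleBuffer])) {
        CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sample);
        [frames addObject:(__bridge id)pixels];   // the array retains the pixel buffer
        CFRelease(sample);
    }

    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
    for (NSUInteger i = 0; i < frames.count; i++) {
        // The last decoded frame goes out first, but presentation times still increase.
        CVPixelBufferRef pixels = (__bridge CVPixelBufferRef)frames[frames.count - 1 - i];
        while (!writerInput.isReadyForMoreMediaData) { [NSThread sleepForTimeInterval:0.01]; }
        [adaptor appendPixelBuffer:pixels withPresentationTime:CMTimeMake((int64_t)i, fps)];
    }
    [writerInput markAsFinished];
    [writer finishWritingWithCompletionHandler:^{ /* export complete */ }];
}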

How to set expected framerate to AVAssetWriterInput

Submitted by 烂漫一生 on 2019-12-05 22:05:57
I have an app which encodes videos in different ways and saves them to the Photos library - it can cut a specific time range, add pictures, text, etc. Everything works perfectly until I try to encode 120+ fps video. The problem is that the video comes out slow-motioned, which is not what I want at all. I found a property for AVAssetWriterInput called AVVideoExpectedSourceFrameRateKey, but when I try to apply this parameter to my AVAssetWriterInput, I get this error: *** Terminating app due to uncaught exception 'NSInvalidArgumentException', …
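Not part of the original post, but a likely cause of that exception: AVVideoExpectedSourceFrameRateKey is a compression property, so it belongs inside the AVVideoCompressionPropertiesKey sub-dictionary of the output settings rather than at the top level. A sketch under that assumption (dimensions, bit rate and profile below are placeholders):

NSDictionary *compressionProperties = @{
    AVVideoExpectedSourceFrameRateKey : @120,                // matches the 120 fps source
    AVVideoAverageBitRateKey          : @(10 * 1000 * 1000), // placeholder bit rate
    AVVideoProfileLevelKey            : AVVideoProfileLevelH264HighAutoLevel
};
NSDictionary *outputSettings = @{
    AVVideoCodecKey                 : AVVideoCodecH264,
    AVVideoWidthKey                 : @1920,
    AVVideoHeightKey                : @1080,
    AVVideoCompressionPropertiesKey : compressionProperties
};
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];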

Using AVAssetWriter with raw NAL Units

Submitted by 随声附和 on 2019-12-05 21:21:17
Question: I noticed in the iOS documentation for AVAssetWriterInput that you can pass nil for the outputSettings dictionary to specify that the input data should not be re-encoded: "The settings used for encoding the media appended to the output. Pass nil to specify that appended samples should not be re-encoded." I want to take advantage of this feature to pass in a stream of raw H.264 NALs, but I am having trouble adapting my raw byte streams into a CMSampleBuffer that I can pass into AVAssetWriterInput's …
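For context (not from the post itself): with outputSettings of nil the writer expects ready-made H.264 samples in AVCC (length-prefixed) form, described by a CMVideoFormatDescription built from the stream's SPS/PPS. A rough sketch of that wrapping, assuming the caller supplies sps, pps, one length-prefixed access unit avccData, and the timing values:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#include <stdlib.h>
#include <string.h>

// Build a format description once, from the stream's SPS and PPS NAL units.
static CMVideoFormatDescriptionRef makeFormatDescription(NSData *sps, NSData *pps) {
    const uint8_t *const parameterSets[2] = { sps.bytes, pps.bytes };
    const size_t parameterSizes[2] = { sps.length, pps.length };
    CMVideoFormatDescriptionRef format = NULL;
    CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2,
                                                        parameterSets, parameterSizes,
                                                        4 /* NAL length field size */, &format);
    return format;
}

// Wrap one AVCC (length-prefixed) access unit so it can go to appendSampleBuffer:.
static CMSampleBufferRef makeSampleBuffer(NSData *avccData, CMVideoFormatDescriptionRef format,
                                          CMTime pts, CMTime duration) {
    void *bytes = malloc(avccData.length);
    memcpy(bytes, avccData.bytes, avccData.length);

    CMBlockBufferRef block = NULL;
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, bytes, avccData.length,
                                       kCFAllocatorDefault,  // frees `bytes` when the block buffer goes away
                                       NULL, 0, avccData.length, 0, &block);

    CMSampleTimingInfo timing = { .duration = duration,
                                  .presentationTimeStamp = pts,
                                  .decodeTimeStamp = kCMTimeInvalid };
    size_t sampleSize = avccData.length;
    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreate(kCFAllocatorDefault, block, true, NULL, NULL, format,
                         1, 1, &timing, 1, &sampleSize, &sampleBuffer);
    CFRelease(block);
    return sampleBuffer;
}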

How to encode video with transparent background

Submitted by 无人久伴 on 2019-12-05 18:46:48
I am encoding a video using Cocoa for OS X (with AVAssetWriter) in H.264. This is the configuration:

// Configure video writer
AVAssetWriter *m_videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:@(outputFile)] fileType:AVFileTypeMPEG4 error:NULL];
// Configure video input
NSDictionary *videoSettings = @{ AVVideoCodecKey : AVVideoCodecH264, AVVideoWidthKey : @(m_width), AVVideoHeightKey : @(m_height) };
AVAssetWriterInput *m_writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
// Add video input into video writer
[m …
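Worth flagging when reading this question: H.264 itself carries no alpha channel, so no H.264 output settings will preserve transparency. On OS X one codec that does keep alpha is ProRes 4444. A sketch of the writer configuration under that assumption, reusing the question's m_width/m_height/outputFile and switching to a QuickTime container:

// ProRes 4444 in a .mov container keeps the alpha channel; H.264/MPEG-4 will not.
NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecAppleProRes4444,
                                 AVVideoWidthKey  : @(m_width),
                                 AVVideoHeightKey : @(m_height) };
AVAssetWriter *m_videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:@(outputFile)]
                                                         fileType:AVFileTypeQuickTimeMovie
                                                            error:NULL];
AVAssetWriterInput *m_writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                                   outputSettings:videoSettings];
[m_videoWriter addInput:m_writerInput];
// Feed it BGRA pixel buffers (kCVPixelFormatType_32BGRA) so the source alpha survives.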

Memory Management issue with AVAssetWriter in iPhone?

Submitted by 久未见 on 2019-12-05 12:42:52
I have successfully created video from UIImages using AVAssetWriter. But as soon as the writer starts writing the video, there is a sudden rise in memory allocation in Instruments. The allocation spikes from 3-4 MB to 120 MB and then cools off. I have used the following code for this...

-(void)writeImageAsMovie:(NSArray *)array toPath:(NSString *)path size:(CGSize)size {
    NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init];
    [attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey]; …
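Not from the original post, but two changes that usually flatten that spike: wrap each frame's work in an @autoreleasepool, and draw into buffers vended by the adaptor's pixelBufferPool instead of creating a new CVPixelBuffer per image. A sketch of the per-frame loop under those assumptions; fillPixelBuffer is a hypothetical helper that renders a UIImage into the buffer:

// Per-frame loop: recycle buffers from the adaptor's pool and drain autoreleased
// objects every iteration so the allocation graph stays flat.
for (NSUInteger i = 0; i < array.count; i++) {
    @autoreleasepool {
        while (!writerInput.isReadyForMoreMediaData) { [NSThread sleepForTimeInterval:0.01]; }

        CVPixelBufferRef buffer = NULL;
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
        fillPixelBuffer(buffer, array[i]);   // hypothetical: draw the UIImage into the buffer
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake((int64_t)i, 30)];
        CVPixelBufferRelease(buffer);
    }
}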

How do I use AVAssetWriter?

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-05 10:52:27
Question: I'd like to take some video frames and encode them into a video. It looks like that's exactly what AVAssetWriter was meant for, but no matter how I eyeball the docs and Google, I can't find any way to actually use it. From the docs it looks like I need an input (AVAssetWriterInput) to feed the writer from. Fine. But the AVAssetWriterInput class is abstract, and the only subclass that I know of in 4.1 is AVAssetWriterInputPixelBufferAdaptor, which requires an AVAssetWriterInput in its initializer …
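For orientation (a sketch, not the one canonical recipe): AVAssetWriterInput is a concrete class you create directly; AVAssetWriterInputPixelBufferAdaptor is not a subclass of it but a wrapper that makes appending CVPixelBuffers easier. The moving parts fit together roughly like this, with the path and dimensions as placeholders:

#import <AVFoundation/AVFoundation.h>

NSURL *url = [NSURL fileURLWithPath:@"/tmp/out.mov"];            // placeholder path
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:url
                                                  fileType:AVFileTypeQuickTimeMovie
                                                     error:NULL];
NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                            AVVideoWidthKey  : @640,
                            AVVideoHeightKey : @480 };
AVAssetWriterInput *input = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                           outputSettings:settings];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                      sourcePixelBufferAttributes:nil];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

// For each frame: wait until input.isReadyForMoreMediaData, then
// [adaptor appendPixelBuffer:buffer withPresentationTime:time];

[input markAsFinished];
[writer finishWritingWithCompletionHandler:^{ /* done */ }];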

How to add static and dynamic overlays to video with AVAssetWriter?

Submitted by 泪湿孤枕 on 2019-12-05 09:32:31
What's the right way to add an image overlay to a video created with AVAssetWriter? It's possible to do so with AVAssetExportSession, but this question is about how to do it with AVAssetWriter so there is more control over the quality and output. There are two scenarios: 1) Simple: add a single overlay that is present for the entire duration of the video (similar to a watermark). 2) Complex: add different overlays that animate in and out of the video at different times (similar to using AVVideoCompositionCoreAnimationTool). Tim Bull: There are a lot of different approaches to this, and the correct …
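One writer-side approach (a sketch, not necessarily what the quoted answer goes on to describe): since every CVPixelBuffer passes through your hands before it is appended, you can composite the overlay into the buffer with Core Graphics. The static watermark case is one draw per frame; the animated case picks the overlay image, alpha and position from the frame's presentation time. The helper below assumes 32BGRA pixel buffers:

// Composite an overlay CGImage onto a 32BGRA pixel buffer before appending it.
static void drawOverlay(CVPixelBufferRef buffer, CGImageRef overlay, CGRect rect) {
    CVPixelBufferLockBaseAddress(buffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                 CVPixelBufferGetWidth(buffer),
                                                 CVPixelBufferGetHeight(buffer), 8,
                                                 CVPixelBufferGetBytesPerRow(buffer),
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, rect, overlay);   // the overlay's own alpha is respected
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(buffer, 0);
}

// Scenario 2: compute the overlay's alpha/position from the frame's presentation time
// before drawing, e.g. fade in over the first second with
// MIN(1.0, CMTimeGetSeconds(presentationTime)), and pass those into a variant of drawOverlay.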

Video Compression iOS: Reduce the size of a video using AVAssetWriter

Submitted by 老子叫甜甜 on 2019-12-04 23:44:58
Question: I have successfully encoded a series of images captured via AVFoundation's AVCaptureVideoPreviewLayer into a video using AVAssetWriter and AVAssetWriterInput, but the size of the video is too large. Can anyone suggest a tutorial, point me in the right direction, or give me the correct video output settings for compression? I am using the following videoOutputSettings:

videoOutputSettings = @{ AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: [NSNumber numberWithInt …
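For context (not quoted from an answer): the setting that most directly shrinks the file is the average bit rate inside AVVideoCompressionPropertiesKey; without it the encoder picks a fairly generous default. A sketch with illustrative numbers to tune:

NSDictionary *videoOutputSettings = @{
    AVVideoCodecKey  : AVVideoCodecH264,
    AVVideoWidthKey  : @1280,                                  // placeholder dimensions
    AVVideoHeightKey : @720,
    AVVideoCompressionPropertiesKey : @{
        AVVideoAverageBitRateKey      : @(2 * 1000 * 1000),    // ~2 Mbit/s; lower = smaller file
        AVVideoProfileLevelKey        : AVVideoProfileLevelH264MainAutoLevel,
        AVVideoMaxKeyFrameIntervalKey : @30
    }
};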

How to control video frame rate with AVAssetReader and AVAssetWriter?

Submitted by 一世执手 on 2019-12-04 15:56:51
We are trying to understand how to control/specify the frame rate for videos that we are encoding with AVAssetReader and AVAssetWriter. Specifically, we are using AVAssetReader and AVAssetWriter to transcode/encode/compress a video that we have accessed from the photo/video gallery. We are able to control things like bit rate, aspect ratio changes, etc., but cannot figure out how to control the frame rate. To be specific, we'd like to be able to take as input a 30 FPS video that's 5 minutes long and emit a 5-minute video at 15 FPS. Our current loop that processes sample buffers is: …
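One way to get 30 FPS in and 15 FPS out with a reader/writer pair is to drop every other video sample in that loop and re-stamp the kept frames; a sketch assuming readerOutput, writerInput and a pixel buffer adaptor are already set up and the decimation is an exact 2:1:

// Keep every other frame; re-stamping at 15 fps preserves the original 5-minute duration.
int64_t frameIndex = 0;
int64_t keptFrames = 0;
CMSampleBufferRef sample = NULL;
while ((sample = [readerOutput copyNextSampleBuffer])) {
    if (frameIndex++ % 2 == 0) {
        while (!writerInput.isReadyForMoreMediaData) { [NSThread sleepForTimeInterval:0.01]; }
        CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sample);
        [adaptor appendPixelBuffer:pixels withPresentationTime:CMTimeMake(keptFrames++, 15)];
    }
    CFRelease(sample);
}
[writerInput markAsFinished];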