avassetwriter

AVAssetWriter How to create mov video with no compression?

痞子三分冷 submitted on 2019-11-29 05:12:55
I'm creating a video from an array of images. The purpose of my work is to create a .mov video with no compression. I have seen in the developer library that there is a key, AVVideoCompressionPropertiesKey, but I don't know how to specify no compression with it. Could you help me please? Here is my sample code:

NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:320], AVVideoCleanApertureWidthKey,
    [NSNumber numberWithInt:480], AVVideoCleanApertureHeightKey,
    [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
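A minimal sketch of one common answer to this question: AVAssetWriter does not expose a true "uncompressed" video codec through AVVideoCompressionPropertiesKey, and the closest out-of-the-box option is an intra-frame, visually lossless codec such as ProRes 4444 (the dimensions below are just the 320×480 values from the question):

```swift
import AVFoundation

// Sketch, not a definitive answer: AVVideoCompressionPropertiesKey cannot
// request "no compression". ProRes 4444 (iOS 11+/macOS) is the closest
// writer-supported option to an uncompressed .mov.
let settings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.proRes4444,
    AVVideoWidthKey: 320,
    AVVideoHeightKey: 480
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
```

Alternatively, passing `nil` for `outputSettings` makes the input pass appended sample buffers through without re-encoding, which avoids compression entirely if the source frames are already in the desired format.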

How do I control AVAssetWriter to write at the correct FPS

此生再无相见时 submitted on 2019-11-29 04:08:54
Let me see if I understood it correctly. On the most advanced current hardware, iOS allows me to record at the following fps: 30, 60, 120 and 240. But these fps behave differently. If I shoot at 30 or 60 fps, I expect the video files created from shooting at these fps to play at 30 and 60 fps respectively. But if I shoot at 120 or 240 fps, I expect the video files created from shooting at these fps to play at 30 fps, or I will not see the slow motion. A few questions: am I right? Is there a way to shoot at 120 or 240 fps and play at 120 and 240 fps respectively? I mean play at the fps the
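A sketch of the capture-side configuration this question is circling around: locking the `AVCaptureDevice` to a high-frame-rate format and pinning both min and max frame durations, so frames are captured and timestamped at that rate (the helper name and the 240 fps value are illustrative, not from the question):

```swift
import AVFoundation

// Sketch (hypothetical helper): pin a capture device to a target frame rate.
// Whether the file *plays* in slow motion is then a timing decision made
// when writing/retiming the samples, not a property of the capture itself.
func configureHighFrameRate(device: AVCaptureDevice, fps: Double) throws {
    guard let format = device.formats.first(where: { f in
        f.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= fps }
    }) else { return }
    try device.lockForConfiguration()
    device.activeFormat = format
    let duration = CMTime(value: 1, timescale: CMTimeScale(fps))
    device.activeVideoMinFrameDuration = duration
    device.activeVideoMaxFrameDuration = duration
    device.unlockForConfiguration()
}
```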

How would I put together a video using the AVAssetWriter in swift?

别来无恙 submitted on 2019-11-29 01:50:02
I'm currently making a small app that timelapses the webcam on my Mac, saves the captured frames to PNG, and I am looking into exporting the captured frames as a single video. I use CGImage to handle the original images and have them set in an array, but I'm unsure where to go from there. I gather from my own research that I have to use AVAssetWriter and AVAssetWriterInput somehow. I've had a look about on here, read the Apple docs and searched Google. But all the guides, etc., are in Obj-C
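A Swift sketch of the standard pipeline the question asks for: AVAssetWriter plus an AVAssetWriterInputPixelBufferAdaptor, drawing each CGImage into a pooled pixel buffer. The function name and the fixed-fps timing scheme are assumptions for illustration:

```swift
import AVFoundation
import CoreGraphics

// Sketch (hypothetical helper): write an array of CGImages to a .mov at a fixed fps.
func writeMovie(images: [CGImage], to url: URL, size: CGSize, fps: Int32) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB
        ])
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (index, image) in images.enumerated() {
        // Back-pressure: wait until the input can accept more data.
        while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }
        guard let pool = adaptor.pixelBufferPool else { break }
        var buffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
        guard let pixelBuffer = buffer else { continue }
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                width: Int(size.width), height: Int(size.height),
                                bitsPerComponent: 8,
                                bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        context?.draw(image, in: CGRect(origin: .zero, size: size))
        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        // Each frame i is presented at time i/fps.
        adaptor.append(pixelBuffer,
                       withPresentationTime: CMTime(value: CMTimeValue(index), timescale: fps))
    }
    input.markAsFinished()
    writer.finishWriting {}
}
```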

AVAssetWriterInput H.264 Passthrough to QuickTime (.mov) - Passing in SPS/PPS to create avcC atom?

旧时模样 submitted on 2019-11-29 01:16:11
I have a stream of H.264/AVC NALs consisting of types 1 (P frame), 5 (I frame), 7 (SPS), and 8 (PPS). I want to write them into a .mov file without re-encoding. I'm attempting to use AVAssetWriter to do this. The documentation for AVAssetWriterInput states: "Passing nil for outputSettings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable
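For the passthrough route, the SPS/PPS do not go into the sample buffers themselves; they go into the CMVideoFormatDescription attached to each buffer, and the writer derives the avcC atom from that. A sketch, assuming the SPS/PPS are raw parameter-set payloads without start codes:

```swift
import CoreMedia
import Foundation

// Sketch: build a CMVideoFormatDescription from raw SPS/PPS so that sample
// buffers appended with nil outputSettings carry the data for the avcC atom.
func makeFormatDescription(sps: Data, pps: Data) -> CMVideoFormatDescription? {
    var formatDesc: CMVideoFormatDescription?
    sps.withUnsafeBytes { (spsPtr: UnsafeRawBufferPointer) in
        pps.withUnsafeBytes { (ppsPtr: UnsafeRawBufferPointer) in
            let pointers = [spsPtr.bindMemory(to: UInt8.self).baseAddress!,
                            ppsPtr.bindMemory(to: UInt8.self).baseAddress!]
            let sizes = [sps.count, pps.count]
            CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: pointers,
                parameterSetSizes: sizes,
                nalUnitHeaderLength: 4,   // 4-byte AVCC length prefixes
                formatDescriptionOut: &formatDesc)
        }
    }
    return formatDesc
}
```

Note that the type 1/5 NAL payloads must also be converted from Annex B start codes to 4-byte length prefixes before being wrapped in CMSampleBuffers.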

AVAssetWriter Woes

烂漫一生 submitted on 2019-11-29 00:51:27
I'm trying to use AVAssetWriter to write CGImages to a file to create a video from images. I've gotten this to work successfully in three different ways on the simulator, but every method fails on an iPhone 4 running iOS 4.3. This all has to do with pixel buffers. My first method was to just create the pixel buffers as needed without using a pool. That works, but is too memory intensive to work on the device. My second method was to use the recommended AVAssetWriterInputPixelBufferAdaptor and
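One frequent cause of the adaptor approach failing on device but not on the simulator is using the adaptor's pool before writing has started, when it is still nil. A small defensive sketch (helper name assumed):

```swift
import AVFoundation

// Sketch: the adaptor's pixelBufferPool is nil until startWriting() and
// startSession(atSourceTime:) have both been called. Guard instead of
// force-unwrapping, which crashes only where the pool genuinely fails.
func pixelBuffer(from adaptor: AVAssetWriterInputPixelBufferAdaptor) -> CVPixelBuffer? {
    guard let pool = adaptor.pixelBufferPool else { return nil }
    var buffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
    return buffer
}
```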

Writing video + generated audio to AVAssetWriterInput, audio stuttering

元气小坏坏 submitted on 2019-11-28 19:58:46
I'm generating a video from a Unity app on iOS. I'm using iVidCap, which uses AVFoundation to do this. That side is all working fine. Essentially the video is rendered by using a texture render target and passing the frames to an Obj-C plugin. Now I need to add audio to the video. The audio is going to be sound effects that occur at specific times and maybe some background sound. The files being used are actually assets internal to the Unity app. I could potentially write these to phone storage and then generate an AVComposition, but my plan was to avoid this and composite the audio in
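For compositing generated audio alongside the video frames, the usual shape is a second AVAssetWriterInput on the same writer, fed CMSampleBuffers with explicit presentation times. A sketch of the audio input setup (the AAC parameters are illustrative defaults, not values from the question):

```swift
import AVFoundation

// Sketch: an AAC audio input added to the same AVAssetWriter as the video
// input. Generated PCM is then wrapped in CMSampleBuffers whose presentation
// timestamps place each sound effect at its intended moment.
let audioSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44_100,
    AVEncoderBitRateKey: 128_000
]
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
audioInput.expectsMediaDataInRealTime = false
```

Stuttering typically points at gaps or overlaps in the appended buffers' timestamps, so the sample timing should be derived from a running frame count rather than wall-clock time.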

Screen capture video in iOS programmatically

不问归期 submitted on 2019-11-28 18:28:46
I'm trying to create a UIView that allows a user to tap a button in it and record the screen (not make a video from the camera), then save it to the Documents folder. I've seen a couple of SO articles here that talk about AVAssetWriter and make references to this link: http://codethink.no-ip.org/wordpress/archives/673 , but that link appears to be dead. But no one has actually shown a solution or provided any examples of how to accomplish this. Anyone have any ideas or can point me in the right direction? This should be simpler than it is. Thanks, Doug

The link is not dead. http://codethink.no-ip.org
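On modern iOS the manual AVAssetWriter screen-capture technique from that article has largely been superseded by ReplayKit, which records the app's screen directly. A minimal sketch:

```swift
import ReplayKit

// Sketch: ReplayKit (iOS 9+) records the app's own screen without the
// render-layer-into-pixel-buffer loop the linked article describes.
RPScreenRecorder.shared().startRecording { error in
    if let error = error {
        print("Could not start recording: \(error)")
    }
}
```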

Data corruption when reading realtime H.264 output from AVAssetWriter

拟墨画扇 submitted on 2019-11-28 15:53:14
I'm using some tricks to try to read the raw output of an AVAssetWriter while it is being written to disk. When I reassemble the individual files by concatenating them, the resulting file is exactly the same number of bytes as the AVAssetWriter's output file. However, the reassembled file will not play in QuickTime or be parsed by FFmpeg because there is data corruption. A few bytes here and there have been changed, rendering the resulting file unusable. I assume this is occurring on the EOF
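The bytes that change are consistent with the writer going back and rewriting header/index data when the session finishes. One mitigation worth sketching: fragmented movie output, which flushes self-contained headers at an interval instead of rewriting at the end (the URL and one-second interval are illustrative):

```swift
import AVFoundation
import CoreMedia

// Sketch: with a movieFragmentInterval set, the .mov is written as a series
// of self-contained fragments, so bytes already on disk are not revisited
// at finish time and a mid-write snapshot of the file remains playable.
let url = URL(fileURLWithPath: "/tmp/capture.mov")  // hypothetical path
let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
writer.movieFragmentInterval = CMTime(seconds: 1, preferredTimescale: 600)
```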

AVAssetWriter AVVideoExpectedSourceFrameRateKey (frame rate) ignored

放肆的年华 submitted on 2019-11-28 12:52:27
My team and I are trying to re-encode a video file to a more "gify" feeling by changing the video frame rate. We are using the following properties for the AVAssetWriterInput:

let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoHeightKey: videoTrack.naturalSize.height,
    AVVideoWidthKey: videoTrack.naturalSize.width,
    AVVideoCompressionPropertiesKey: [AVVideoExpectedSourceFrameRateKey: NSNumber(value: 12)]
]

But the output video keeps playing at the normal frame rate (played using AVPlayer). What is the right way to reduce the video frame rate (12, for example)? Any help in
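`AVVideoExpectedSourceFrameRateKey` is only a hint to the encoder; it does not retime anything. To change playback speed the sample buffers themselves need new timestamps before being appended. A sketch of such a retiming helper (name and approach assumed, not the asker's code):

```swift
import CoreMedia

// Sketch (hypothetical helper): copy a sample buffer with its timing scaled
// by `factor` (e.g. 30/12 = 2.5 to stretch 30 fps source to 12 fps pacing).
func retimed(_ sample: CMSampleBuffer, by factor: Float64) -> CMSampleBuffer? {
    var count: CMItemCount = 0
    CMSampleBufferGetSampleTimingInfoArray(sample, entryCount: 0,
                                           arrayToFill: nil, entriesNeededOut: &count)
    var timing = [CMSampleTimingInfo](repeating: CMSampleTimingInfo(), count: count)
    CMSampleBufferGetSampleTimingInfoArray(sample, entryCount: count,
                                           arrayToFill: &timing, entriesNeededOut: &count)
    for i in 0..<count {
        timing[i].presentationTimeStamp =
            CMTimeMultiplyByFloat64(timing[i].presentationTimeStamp, multiplier: factor)
        if timing[i].decodeTimeStamp.isValid {
            timing[i].decodeTimeStamp =
                CMTimeMultiplyByFloat64(timing[i].decodeTimeStamp, multiplier: factor)
        }
    }
    var out: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                          sampleBuffer: sample,
                                          sampleTimingEntryCount: count,
                                          sampleTimingArray: &timing,
                                          sampleBufferOut: &out)
    return out
}
```

Alternatively, dropping frames while appending (keeping every Nth sample) reduces the effective rate without changing duration.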

OpenGL ES 2.0 to Video on iPad/iPhone

偶尔善良 submitted on 2019-11-27 18:53:15
I am at my wits' end here despite the good information here on StackOverflow... I am trying to write an OpenGL renderbuffer to a video on the iPad 2 (using iOS 4.3). This is more exactly what I am attempting:

A) set up an AVAssetWriterInputPixelBufferAdaptor
- create an AVAssetWriter that points to a video file
- set up an AVAssetWriterInput with appropriate settings
- set up an AVAssetWriterInputPixelBufferAdaptor to add data to the video file

B) write data to a video file using that AVAssetWriterInputPixelBufferAdaptor
- render OpenGL code to the screen
- get the OpenGL buffer via glReadPixels
- create a
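A note on step B above, sketched under the assumption that the bottleneck is the `glReadPixels` CPU copy: creating the pool's buffers as OpenGL ES-compatible BGRA lets a texture cache bind the renderbuffer to a CVPixelBuffer directly, so frames reach the adaptor without the readback (the dimensions are illustrative):

```swift
import AVFoundation

// Sketch: pixel-buffer attributes compatible with both OpenGL ES rendering
// (via CVOpenGLESTextureCache) and AVAssetWriterInputPixelBufferAdaptor.
let attrs: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
    kCVPixelBufferOpenGLESCompatibilityKey as String: true,
    kCVPixelBufferWidthKey as String: 1024,
    kCVPixelBufferHeightKey as String: 768
]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: input,
    sourcePixelBufferAttributes: attrs)
```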