AVAssetWriter

Completion handler is not called on iOS7 GM

Submitted by ε祈祈猫儿з on 2019-11-30 06:47:05
I'm using AVAssetWriter, and it works perfectly on iOS 6. The problem is that when I call finishWritingWithCompletionHandler, the completion handler is never called on the iOS 7 GM. I call markAsFinished, and even endSessionAtSourceTime, before calling finishWritingWithCompletionHandler. It works fine on iOS 6. What's more, on iOS 7 it sometimes works and then stops working again. I don't know why, but it does work if I trigger the call from an alert view, so I tried performSelectorOnMainThread and inBackground, but it didn't help. Any ideas? Apparently you need to retain the assetWriter now. You …
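The last sentence of the excerpt points at the usual fix: from iOS 7 on, the AVAssetWriter must stay strongly referenced until its completion handler has fired. A minimal sketch, assuming a hypothetical Recorder class with assetWriter, videoWriterInput and audioWriterInput properties:

#import <AVFoundation/AVFoundation.h>

@interface Recorder : NSObject   // hypothetical class
// Strong reference: if the writer is deallocated before the handler runs,
// the completion handler is silently never called on iOS 7.
@property (nonatomic, strong) AVAssetWriter *assetWriter;
@property (nonatomic, strong) AVAssetWriterInput *videoWriterInput;
@property (nonatomic, strong) AVAssetWriterInput *audioWriterInput;
@end

@implementation Recorder

- (void)stopRecordingAtSourceTime:(CMTime)endTime
{
    [self.videoWriterInput markAsFinished];
    [self.audioWriterInput markAsFinished];
    [self.assetWriter endSessionAtSourceTime:endTime];

    __weak typeof(self) weakSelf = self;
    [self.assetWriter finishWritingWithCompletionHandler:^{
        NSLog(@"Writer finished with status %ld", (long)weakSelf.assetWriter.status);
        weakSelf.assetWriter = nil;   // let go of the writer only after the handler has run
    }];
}

@end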

Performance issues when using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

Submitted by 穿精又带淫゛_ on 2019-11-30 05:25:03
I'm having lag issues when recording audio + video using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. Sometimes the video freezes for a few milliseconds, and sometimes the audio is not in sync with the video. I inserted some logs and observed that I first get a lot of video buffers in the captureOutput callback, and only after some time do I get the audio buffers (sometimes I don't receive the audio buffers at all, and the resulting video has no sound). If I comment out the code that handles the video buffers, I get the audio buffers without problems. This is the code I'm using: -(void …
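A pattern that usually removes this kind of starvation is to give each data output its own serial queue and keep the delegate callback as cheap as possible. A rough sketch, assuming a hypothetical capture class with videoOutput, audioOutput, videoWriterInput and audioWriterInput properties:

- (void)configureDataOutputsForSession:(AVCaptureSession *)session
{
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.videoOutput.alwaysDiscardsLateVideoFrames = YES;   // drop late frames instead of queueing them up

    self.audioOutput = [[AVCaptureAudioDataOutput alloc] init];

    // Separate serial queues: a slow video callback can no longer delay audio delivery.
    dispatch_queue_t videoQueue = dispatch_queue_create("capture.video", DISPATCH_QUEUE_SERIAL);
    dispatch_queue_t audioQueue = dispatch_queue_create("capture.audio", DISPATCH_QUEUE_SERIAL);
    [self.videoOutput setSampleBufferDelegate:self queue:videoQueue];
    [self.audioOutput setSampleBufferDelegate:self queue:audioQueue];

    if ([session canAddOutput:self.videoOutput]) [session addOutput:self.videoOutput];
    if ([session canAddOutput:self.audioOutput]) [session addOutput:self.audioOutput];
}

// Do the minimum amount of work per buffer: append and return.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (output == self.videoOutput && self.videoWriterInput.isReadyForMoreMediaData) {
        [self.videoWriterInput appendSampleBuffer:sampleBuffer];
    } else if (output == self.audioOutput && self.audioWriterInput.isReadyForMoreMediaData) {
        [self.audioWriterInput appendSampleBuffer:sampleBuffer];
    }
}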

AVAssetWriterInput H.264 Passthrough to QuickTime (.mov) - Passing in SPS/PPS to create avcC atom?

Submitted by 杀马特。学长 韩版系。学妹 on 2019-11-30 04:07:19
I have a stream of H.264/AVC NALs consisting of types 1 (P frame), 5 (I frame), 7 (SPS), and 8 (PPS). I want to write them into a .mov file without re-encoding, and I'm attempting to use AVAssetWriter to do this. The documentation for AVAssetWriterInput states: "Passing nil for outputSettings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable compressed format. However, passthrough is currently supported only when writing to QuickTime Movie …"
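One way this is commonly handled (a sketch of the technique, not necessarily the asker's final solution, and requiring iOS 7 or later): build a CMVideoFormatDescription from the SPS and PPS NALs and pass it as the input's sourceFormatHint, which is where the avcC atom comes from. The sps/pps pointers and sizes below are assumed, and the appended sample data must use AVCC length-prefixed framing rather than Annex B start codes:

CMFormatDescriptionRef formatDesc = NULL;
const uint8_t * const parameterSets[2] = { sps, pps };      // raw SPS and PPS payloads, no start codes
const size_t parameterSetSizes[2]      = { spsSize, ppsSize };

OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
        kCFAllocatorDefault,
        2,                      // one SPS + one PPS
        parameterSets,
        parameterSetSizes,
        4,                      // length of the AVCC NAL length field, in bytes
        &formatDesc);
if (status != noErr) { /* handle the error */ }

// nil outputSettings => passthrough; the format hint supplies the avcC data.
AVAssetWriterInput *videoInput =
    [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                   outputSettings:nil
                                 sourceFormatHint:formatDesc];
videoInput.expectsMediaDataInRealTime = YES;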

How to record screen video like the Talking Tomcat application does on iPhone?

Submitted by こ雲淡風輕ζ on 2019-11-30 04:05:37
Hey, I'm trying to record the gameplay of my game so that I can upload the video to YouTube from the device itself. I'm trying to do the same thing as the Talking Tomcat app for iPhone: recording the video, then playing it back, etc. I'm using glReadPixels() to read the framebuffer data and then writing it to a video with the help of AVAssetWriter in the AVFoundation framework. But reading the data on every draw call drops the FPS from around 30-35 to only 2-3 while glReadPixels is in use. I think Talking Tomcat is also built with OpenGL ES, and it also has a video recording facility, yet it does not slow down …
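The usual way around the glReadPixels stall is to render into a texture backed by a CVPixelBuffer through CVOpenGLESTextureCache, so each frame already sits in memory that an AVAssetWriterInputPixelBufferAdaptor can append. A rough sketch; eaglContext, width, height, pixelBufferAdaptor and frameTime are assumed to exist in the recording code:

// One-time setup: a pixel buffer and a texture that share the same memory.
CVOpenGLESTextureCacheRef textureCache = NULL;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext, NULL, &textureCache);

NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferRef renderTarget = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef)attrs, &renderTarget);

CVOpenGLESTextureRef renderTexture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, renderTarget,
                                             NULL, GL_TEXTURE_2D, GL_RGBA,
                                             (GLsizei)width, (GLsizei)height,
                                             GL_BGRA, GL_UNSIGNED_BYTE, 0, &renderTexture);

// Attach the texture to the FBO you draw into; no glReadPixels is needed afterwards.
glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture), 0);

// Per frame, after drawing: hand the same pixel buffer to the writer.
if (pixelBufferAdaptor.assetWriterInput.isReadyForMoreMediaData) {
    [pixelBufferAdaptor appendPixelBuffer:renderTarget withPresentationTime:frameTime];
}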

How would I put together a video using AVAssetWriter in Swift?

Submitted by 时光毁灭记忆、已成空白 on 2019-11-30 04:03:02
I'm currently making a small app that time-lapses the webcam on my Mac, saves each captured frame as a PNG, and I'm looking into exporting the captured frames as a single video. I use CGImage to handle the original images and keep them in an array, but I'm unsure where to go from there. I gather from my own research that I have to use AVAssetWriter and AVAssetWriterInput somehow. I've had a look around on here, read the Apple docs and searched Google, but all the guides etc. are in Objective-C rather than Swift, which makes it really difficult to understand (as I have no experience in Objective-C).
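The classes involved are the same whichever language you call them from. A rough Objective-C sketch of the sequence (writer, video input, pixel-buffer adaptor, per-frame append); the frame array, output URL, dimensions and frame rate are assumed:

static void WriteFramesToMovie(CGImageRef *frames, size_t frameCount,
                               NSURL *outputURL, size_t width, size_t height, int32_t fps)
{
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                             AVVideoWidthKey  : @(width),
                                                             AVVideoHeightKey : @(height) }];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
            sourcePixelBufferAttributes:@{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
                                           (id)kCVPixelBufferWidthKey  : @(width),
                                           (id)kCVPixelBufferHeightKey : @(height) }];
    [writer addInput:input];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    for (size_t i = 0; i < frameCount; i++) {
        CVPixelBufferRef buffer = NULL;
        CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);

        // Draw the CGImage straight into the pixel buffer's memory.
        CVPixelBufferLockBaseAddress(buffer, 0);
        CGContextRef ctx = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                 width, height, 8,
                                                 CVPixelBufferGetBytesPerRow(buffer),
                                                 rgb, kCGImageAlphaNoneSkipFirst);
        CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), frames[i]);
        CGContextRelease(ctx);
        CVPixelBufferUnlockBaseAddress(buffer, 0);

        while (!input.isReadyForMoreMediaData) { usleep(1000); }   // crude back-pressure
        [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake((int64_t)i, fps)];
        CVPixelBufferRelease(buffer);
    }
    CGColorSpaceRelease(rgb);

    [input markAsFinished];
    [writer finishWritingWithCompletionHandler:^{ NSLog(@"status: %ld", (long)writer.status); }];
}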

How to merge video and audio files?

Submitted by £可爱£侵袭症+ on 2019-11-29 23:06:26
Question: How do I merge video and audio files that have almost the same duration? I searched and got some answers to this question, but when I try the code they gave, it just does not produce a non-zero-byte movie. Could you take a look at it and see where it went wrong? -(void)putTogether { NSLog(@"Starting to put together all the files!"); AVMutableComposition *mixComposition = [AVMutableComposition composition]; NSString *audioPath = @"/Users/admin/Documents/Sound.caf"; NSURL …
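For reference, the composition step usually has roughly this shape (a sketch with assumed asset URLs and error handling trimmed); the file itself is then produced by an export session:

AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];

// One composition track per media type, filled from the source assets.
AVMutableCompositionTrack *videoTrack =
    [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject]
                     atTime:kCMTimeZero
                      error:nil];

AVMutableCompositionTrack *audioTrack =
    [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                     atTime:kCMTimeZero
                      error:nil];

// Hand mixComposition to an AVAssetExportSession to actually write the movie file.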

Data corruption when reading realtime H.264 output from AVAssetWriter

Submitted by 大兔子大兔子 on 2019-11-29 20:02:09
I'm using some tricks to try to read the raw output of an AVAssetWriter while it is being written to disk. When I reassemble the individual files by concatenating them, the resulting file has exactly the same number of bytes as the AVAssetWriter's output file. However, the reassembled file will not play in QuickTime and cannot be parsed by FFmpeg because of data corruption. A few bytes here and there have been changed, rendering the resulting file unusable. I assume this is occurring on the EOF boundary of each read, but it isn't consistent corruption. I plan to eventually use code similar to this …

Merging Audio with Video Objective-C

Submitted by 泪湿孤枕 on 2019-11-29 15:46:10
Question: I'm attempting to merge an audio file with a video file using: + (void)CompileFilesToMakeMovie:(NSString *) audioPath { NSLog(@"a"); AVMutableComposition* mixComposition = [AVMutableComposition composition]; NSURL *audio_inputFileUrl = [[NSURL alloc] initFileURLWithPath:audioPath]; NSString *videoPath = [[NSBundle bundleForClass:[self class]] pathForResource:@"input" ofType:@"mov"]; NSURL* video_inputFileUrl = [NSURL fileURLWithPath:videoPath]; NSArray *dirPaths = …
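Whatever builds the composition, the export step is where an empty or missing output file usually comes from, and the export session's status and error say why. A sketch, assuming the mixComposition and an outputURL from the merge code:

// Export fails (and you get no usable file) if the output file already exists.
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];

AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;

[exporter exportAsynchronouslyWithCompletionHandler:^{
    switch (exporter.status) {
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Merged movie written to %@", outputURL);
            break;
        case AVAssetExportSessionStatusFailed:
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export did not finish: %@", exporter.error);   // explains a zero-byte result
            break;
        default:
            break;
    }
}];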

How to convert AudioBufferList to CMSampleBuffer?

Submitted by 六眼飞鱼酱① on 2019-11-29 08:13:25
I have an AudioTapProcessor attached to an AVPlayerItem, which will call static void tap_ProcessCallback(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) when processing. I need to convert the AudioBufferList to a CMSampleBuffer so I can use AVAssetWriterAudioInput.appendSampleBuffer to write it into a movie file. So how do I convert an AudioBufferList to a CMSampleBuffer? I tried this but got a -12731 error: Error cCMSampleBufferSetDataBufferFromAudioBufferList …
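A sketch of the conversion, assuming the AudioStreamBasicDescription was captured in the tap's prepare callback and the caller tracks a presentation time; it follows the usual CMSampleBufferCreate plus CMSampleBufferSetDataBufferFromAudioBufferList pattern rather than any one confirmed fix for the -12731 case:

static CMSampleBufferRef CreateSampleBufferFromAudioBufferList(
        const AudioStreamBasicDescription *asbd,
        AudioBufferList *bufferList,
        CMItemCount numberFrames,
        CMTime presentationTime)
{
    // Format description describing the tap's processing format.
    CMAudioFormatDescriptionRef format = NULL;
    OSStatus err = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, asbd,
                                                  0, NULL, 0, NULL, NULL, &format);
    if (err != noErr) return NULL;

    CMSampleTimingInfo timing = {
        .duration = CMTimeMake(1, (int32_t)asbd->mSampleRate),
        .presentationTimeStamp = presentationTime,
        .decodeTimeStamp = kCMTimeInvalid
    };

    // Create an empty sample buffer first, then attach the audio data to it.
    CMSampleBufferRef sampleBuffer = NULL;
    err = CMSampleBufferCreate(kCFAllocatorDefault,
                               NULL, false, NULL, NULL,   // no data buffer yet
                               format, numberFrames,
                               1, &timing, 0, NULL,
                               &sampleBuffer);
    CFRelease(format);
    if (err != noErr) return NULL;

    // A -12731 result here usually means the format description or timing info
    // above does not match the buffer list being attached.
    err = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer,
                                                         kCFAllocatorDefault,
                                                         kCFAllocatorDefault,
                                                         0, bufferList);
    if (err != noErr) { CFRelease(sampleBuffer); return NULL; }

    return sampleBuffer;   // append with -[AVAssetWriterInput appendSampleBuffer:], then CFRelease
}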

Changing AVCaptureDeviceInput leads to AVAssetWriterStatusFailed

Submitted by 北城余情 on 2019-11-29 07:00:39
I am trying to switch the camera view between front and back, and it works well. If video is recorded without flipping, using the pause/record option, it works fine. But once we flip the camera view, further recorded video is not saved, which leads to AVAssetWriterStatusFailed - The operation could not be completed. Can anybody help me find where I have gone wrong? Below is my code. Camera.m - (void)flipCamera{ NSArray * inputs = _session.inputs; for ( AVCaptureDeviceInput * INPUT in inputs ) { AVCaptureDevice * Device = INPUT.device ; if ( [ Device hasMediaType : AVMediaTypeVideo ] ) { …
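For comparison, here is a sketch of a flip that swaps the AVCaptureDeviceInput inside a single beginConfiguration/commitConfiguration transaction, so the session keeps running while the input changes. Whether this alone avoids AVAssetWriterStatusFailed depends on how the writer inputs cope with the new camera's format, so the writer's error property is still worth checking when the status flips:

- (void)flipCamera
{
    [_session beginConfiguration];

    // Find the current video input.
    AVCaptureDeviceInput *currentInput = nil;
    for (AVCaptureDeviceInput *input in _session.inputs) {
        if ([input.device hasMediaType:AVMediaTypeVideo]) {
            currentInput = input;
            break;
        }
    }

    if (currentInput) {
        [_session removeInput:currentInput];

        AVCaptureDevicePosition newPosition =
            (currentInput.device.position == AVCaptureDevicePositionBack)
                ? AVCaptureDevicePositionFront
                : AVCaptureDevicePositionBack;

        // Add the camera on the opposite side, if the session accepts it.
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == newPosition) {
                NSError *error = nil;
                AVCaptureDeviceInput *newInput =
                    [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
                if (newInput && [_session canAddInput:newInput]) {
                    [_session addInput:newInput];
                }
                break;
            }
        }
    }

    [_session commitConfiguration];
}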