avfoundation

AVFoundation - Retiming CMSampleBufferRef Video Output

坚强是说给别人听的谎言 submitted on 2019-12-03 01:32:31
Question: First time asking a question here. I'm hoping the post is clear and the sample code is formatted correctly. I'm experimenting with AVFoundation and time-lapse photography. My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, 4th generation) and write each of those frames out to a file to create a time lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter, and AVAssetWriterInput. The problem is, if I use the CMSampleBufferRef passed to captureOutput
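
A plausible fix for the retiming problem, sketched in Swift: keep every Nth buffer and restamp it with CMSampleBufferCreateCopyWithNewTiming before appending it to the AVAssetWriterInput. The helper name and the 30 fps output duration are assumptions, not the asker's code.

    import CoreMedia

    // Hypothetical helper: copy `buffer`, restamping it at `time`, so every
    // Nth captured frame can be appended back-to-back at time-lapse speed.
    func retimed(_ buffer: CMSampleBuffer, at time: CMTime) -> CMSampleBuffer? {
        var timing = CMSampleTimingInfo(
            duration: CMTime(value: 1, timescale: 30),  // assumed output frame duration
            presentationTimeStamp: time,
            decodeTimeStamp: .invalid)
        var copy: CMSampleBuffer?
        CMSampleBufferCreateCopyWithNewTiming(
            allocator: kCFAllocatorDefault,
            sampleBuffer: buffer,
            sampleTimingEntryCount: 1,
            sampleTimingArray: &timing,
            sampleBufferOut: &copy)
        return copy
    }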

How to record a video with AVFoundation in Swift?

橙三吉。 submitted on 2019-12-03 01:31:06
Question: I am trying to figure out how to record a video using AVFoundation in Swift. I have got as far as creating a custom camera, but I've only figured out how to take still pictures with it and can't figure out how to record video. From what I understand you have to use AVCaptureVideoDataOutput to get the data from the recording, but I can't figure out how to start the recording and implement the delegate methods. The whole AVFoundation Programming Guide / Still and Video Media Capture section is in Objective-C
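
For plain recording there is a simpler route than AVCaptureVideoDataOutput: AVCaptureMovieFileOutput encodes straight to disk and needs only one delegate method. A minimal Swift sketch, assuming an AVCaptureSession already configured with camera and microphone inputs:

    import AVFoundation

    final class Recorder: NSObject, AVCaptureFileOutputRecordingDelegate {
        let movieOutput = AVCaptureMovieFileOutput()

        // Assumes `session` already has video/audio inputs attached.
        func attach(to session: AVCaptureSession) {
            if session.canAddOutput(movieOutput) {
                session.addOutput(movieOutput)
            }
        }

        func start() {
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("capture.mov")
            movieOutput.startRecording(to: url, recordingDelegate: self)
        }

        func stop() {
            movieOutput.stopRecording()
        }

        // Called once the movie file has finished writing.
        func fileOutput(_ output: AVCaptureFileOutput,
                        didFinishRecordingTo outputFileURL: URL,
                        from connections: [AVCaptureConnection],
                        error: Error?) {
            print("Saved to \(outputFileURL), error: \(String(describing: error))")
        }
    }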

iOS/iPhone photo burst mode API

南楼画角 submitted on 2019-12-03 01:28:06
I'm trying to capture multiple photos at the highest resolution (AVCaptureSessionPresetPhoto) on an iPhone 5s. I tried using the following code:

    dispatch_semaphore_t sync = dispatch_semaphore_create(0);
    while ([self isBurstModeEnabled] == YES) {
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
            completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
                if (imageSampleBuffer != NULL) {
                    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                    NSString *videoThumbPath = [NSString stringWithFormat:@"%@/img%d.png",
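
One likely problem with the loop above is that it signals a semaphore it never waits on, so capture requests pile up faster than the hardware can deliver them. A hedged Swift sketch of the serialized version; `stillImageOutput`, `videoConnection`, and the save path stand in for the asker's objects:

    import AVFoundation

    // Sketch: block until each full-resolution capture completes before
    // requesting the next one, instead of spinning freely.
    func captureBurst(stillImageOutput: AVCaptureStillImageOutput,
                      videoConnection: AVCaptureConnection,
                      while burstEnabled: @escaping () -> Bool) {
        let sync = DispatchSemaphore(value: 0)
        DispatchQueue.global().async {
            var index = 0
            while burstEnabled() {
                stillImageOutput.captureStillImageAsynchronously(from: videoConnection) { buffer, _ in
                    if let buffer = buffer,
                       let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer) {
                        let url = FileManager.default.temporaryDirectory
                            .appendingPathComponent("img\(index).jpg")  // illustrative path
                        try? data.write(to: url)
                    }
                    sync.signal()  // capture finished; release the loop
                }
                sync.wait()        // wait here before starting the next capture
                index += 1
            }
        }
    }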

Creating CMSampleBufferRef from the data

天大地大妈咪最大 submitted on 2019-12-03 01:27:09
I am trying to create a CMSampleBufferRef from raw data and feed it to AVAssetWriter, but the asset writer fails to create the movie from the data. The following is the code to create the CMSampleBufferRef:

    CVImageBufferRef cvimgRef = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(cvimgRef, 0);
    uint8_t *buf = (uint8_t *)CVPixelBufferGetBaseAddress(cvimgRef);
    int width = 480;
    int height = 360;
    int bitmapBytesPerRow = width * 4;
    int bitmapByteCount = bitmapBytesPerRow * height;
    CVPixelBufferRef pixelBufRef = NULL;
    CMSampleBufferRef newSampleBuffer = NULL;
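
A Swift sketch of the pieces the writer needs: a CMSampleBuffer must carry a video format description and timing info, and the presentation timestamps must increase monotonically. (AVAssetWriterInputPixelBufferAdaptor sidesteps all of this by accepting the CVPixelBuffer directly, which is usually the easier fix.)

    import CoreMedia
    import CoreVideo

    // Sketch: wrap an existing CVPixelBuffer in a CMSampleBuffer suitable
    // for AVAssetWriterInput. The presentation time is caller-supplied.
    func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                          presentationTime: CMTime) -> CMSampleBuffer? {
        var formatDescription: CMVideoFormatDescription?
        CMVideoFormatDescriptionCreateForImageBuffer(
            allocator: kCFAllocatorDefault,
            imageBuffer: pixelBuffer,
            formatDescriptionOut: &formatDescription)
        guard let format = formatDescription else { return nil }

        var timing = CMSampleTimingInfo(
            duration: .invalid,
            presentationTimeStamp: presentationTime,
            decodeTimeStamp: .invalid)

        var sampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateForImageBuffer(
            allocator: kCFAllocatorDefault,
            imageBuffer: pixelBuffer,
            dataReady: true,
            makeDataReadyCallback: nil,
            refcon: nil,
            formatDescription: format,
            sampleTiming: &timing,
            sampleBufferOut: &sampleBuffer)
        return sampleBuffer
    }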

iOS alternative to QTMovieLayer that has non-nil `contents`?

匆匆过客 submitted on 2019-12-03 01:20:15
Background: QTKit (QuickTime Kit) is a Mac framework from the 10.3 days that got some layer additions in 10.5, for example QTMovieLayer. One of the nice things with QTMovieLayer is that you can access the movie content using the regular contents property on the layer and get a CAImageQueue object back. The nice thing with this is that you can create a bunch of regular CALayers, set the image queue as their contents, and give each layer its own part of the movie by setting the correct contentsRect. This means that you can create something like the image below with only one movie
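
On iOS, AVPlayerLayer keeps its contents private, so the closest workable substitute is pulling frames yourself with AVPlayerItemVideoOutput and fanning each one out to several CALayers, each cropped via its own contentsRect. A sketch under that assumption; the class name is invented:

    import AVFoundation
    import CoreImage
    import QuartzCore

    // Sketch: vend each movie frame as an image and share it across layers,
    // mimicking QTMovieLayer's shared-contents trick.
    final class FrameDistributor {
        let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
        let ciContext = CIContext()
        var targetLayers: [CALayer] = []  // layers sharing the same frame

        func attach(to item: AVPlayerItem) {
            item.add(output)
        }

        // Call from a CADisplayLink callback.
        func pushFrame(at hostTime: CFTimeInterval) {
            let itemTime = output.itemTime(forHostTime: hostTime)
            guard output.hasNewPixelBuffer(forItemTime: itemTime),
                  let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime,
                                                           itemTimeForDisplay: nil)
            else { return }
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent)
            else { return }
            for layer in targetLayers {
                layer.contents = cgImage  // each layer crops via its contentsRect
            }
        }
    }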

AVFoundation tap to focus feedback rectangle

Deadly submitted on 2019-12-03 01:18:09
Question: I am developing an iPhone application where I use AVFoundation directly to capture videos via the camera. I've implemented a tap-to-focus feature for the user:

    - (void)focus:(CGPoint)aPoint;
    {
    #if HAS_AVFF
        Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
        if (captureDeviceClass != nil) {
            AVCaptureDevice *device = [captureDeviceClass defaultDeviceWithMediaType:AVMediaTypeVideo];
            if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported
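
The focus call itself draws nothing; the feedback rectangle is ordinary UIKit layered over the preview. A Swift sketch of one way to do it; sizes, colors, and timings are guesses at the stock Camera app's look:

    import UIKit

    // Sketch: flash a yellow focus box at the tapped point, then fade it out.
    func showFocusRectangle(at point: CGPoint, in view: UIView) {
        let box = UIView(frame: CGRect(x: 0, y: 0, width: 80, height: 80))
        box.center = point
        box.layer.borderColor = UIColor.yellow.cgColor
        box.layer.borderWidth = 1.5
        box.backgroundColor = .clear
        view.addSubview(box)

        box.transform = CGAffineTransform(scaleX: 1.4, y: 1.4)
        UIView.animate(withDuration: 0.25, animations: {
            box.transform = .identity  // shrink into place
        }, completion: { _ in
            UIView.animate(withDuration: 0.25, delay: 0.5, options: [],
                           animations: { box.alpha = 0 },
                           completion: { _ in box.removeFromSuperview() })
        })
    }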

What's the difference between all these audio frameworks?

心已入冬 submitted on 2019-12-03 01:02:47
Question: In the documentation I see several frameworks for audio, all of which seem to be targeted at playing and recording it. So I wonder what the big differences are between Audio Toolbox, Audio Unit, AV Foundation, and Core Audio. Or did I miss a guide that gives a good overview of them all?
Answer 1: Core Audio is the lowest-level of all the frameworks and also the oldest. Audio Toolbox is just above Core Audio and provides many different APIs that make it easier to deal with sound but still
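
To make the layering concrete, here is the top of that stack: file playback through AVFoundation takes a few lines, whereas the same task through Audio Toolbox's Audio Queue Services or raw Audio Units means managing buffers and render callbacks by hand. The file path below is hypothetical:

    import AVFoundation

    // AVFoundation, the highest layer of the audio stack described above.
    let url = URL(fileURLWithPath: "/path/to/sound.m4a")  // hypothetical path
    if let player = try? AVAudioPlayer(contentsOf: url) {
        player.play()  // keep a strong reference to `player` in real code
    }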

Adding text subtitles to video track (in Swift) fails with error code -11841

为君一笑 submitted on 2019-12-03 00:50:40
I have been struggling with adding text subtitles to videos for a while. I have added some links that I referred to in detail, but they are not helping. In the code below, I am trying to add a subtitle to a video. The output file path is:
file:///var/mobile/Applications/03E49B29-1070-4541-B7CB-B1366732C179/Documents/output_movie.mov
In addition, the input file was recorded with a call to UIPickerView in the same application, at the temporary path below:
file:///private/var/mobile/Applications/03E49B29-1070-4541-B7CB-B1366732C179/tmp/capture/capturedvideo.MOV
The error I am getting is as follows,
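
Error -11841 is AVErrorInvalidVideoComposition: the export fails because the AVVideoComposition is malformed, not because of the file paths. Typical causes are instructions that don't cover the full timeline, a zero renderSize, or layer instructions pointing at the wrong track. A hedged Swift sketch of a composition that usually passes validation; the function and its parameters are illustrative:

    import AVFoundation
    import QuartzCore

    // Sketch: burn a text layer into the video with Core Animation.
    func makeSubtitledComposition(for videoTrack: AVAssetTrack,
                                  in composition: AVMutableComposition,
                                  text: String) -> AVMutableVideoComposition {
        let size = videoTrack.naturalSize

        let textLayer = CATextLayer()
        textLayer.string = text
        textLayer.fontSize = 36
        textLayer.frame = CGRect(x: 0, y: 20, width: size.width, height: 60)

        let videoLayer = CALayer()
        videoLayer.frame = CGRect(origin: .zero, size: size)
        let parentLayer = CALayer()
        parentLayer.frame = videoLayer.frame
        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(textLayer)

        let videoComposition = AVMutableVideoComposition()
        videoComposition.renderSize = size                      // must be non-zero
        videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
        videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
            postProcessingAsVideoLayer: videoLayer, in: parentLayer)

        // One instruction spanning the whole timeline; gaps trigger -11841.
        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRange(start: .zero, duration: composition.duration)
        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        instruction.layerInstructions = [layerInstruction]
        videoComposition.instructions = [instruction]
        return videoComposition
    }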

Capturing video while processing it through a shader on iPhone

不羁的心 submitted on 2019-12-03 00:32:24
I am trying to develop an iPhone app that processes/filters and records video. I have two sample apps that cover aspects of what I need, and I am trying to combine them:
1. AVCamDemo from the WWDC10 sample code package (Apple Developer ID required), which deals with capturing/recording video.
2. Brad Larson's ColorTracking sample app, referenced here, which deals with live processing of video using OpenGL ES.
I get stuck when trying to combine the two. What I have been trying to do is use AVCaptureVideoDataOutput and the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video frames through
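
A sketch of the glue the two samples share: an AVCaptureVideoDataOutput delegate that receives each frame, where it can be uploaded as a texture, run through the shader, and then handed to an AVAssetWriter. The class name and queue label are illustrative:

    import AVFoundation

    // Sketch: receive live camera frames for GPU processing.
    final class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let videoOutput = AVCaptureVideoDataOutput()
        private let queue = DispatchQueue(label: "camera.frames")

        func attach(to session: AVCaptureSession) {
            videoOutput.videoSettings = [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
            ]
            videoOutput.alwaysDiscardsLateVideoFrames = true
            videoOutput.setSampleBufferDelegate(self, queue: queue)
            if session.canAddOutput(videoOutput) {
                session.addOutput(videoOutput)
            }
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            // 1. Upload pixelBuffer as a texture and run the shader over it.
            // 2. Hand the filtered buffer to an AVAssetWriterInput(PixelBufferAdaptor).
            _ = pixelBuffer
        }
    }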

AVAssetWriter multiple sessions and the status property

时间秒杀一切 submitted on 2019-12-03 00:24:52
I am attempting to create multiple serial writing sessions with AVAssetWriter. Once I've completed one successfully (after calling finishWriting), the status is set to 2 (AVAssetWriterStatusCompleted). Trying to create another session, I call startWriting, but I get the error:
[AVAssetWriter startWriting] cannot call method when status is 2
It seems I cannot create a writing session unless I configure something. Do I have to recreate the AVAssetWriter again? I must be missing something, and the docs aren't helping. Thanks.
Answer: After the writer has completed, it is no longer usable. You must create a
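
That answer is the whole story: an AVAssetWriter is one-shot. A Swift sketch of a small factory that builds a fresh writer per recording session; the output settings are assumptions:

    import AVFoundation

    // Sketch: build a new writer for every session instead of reusing one
    // whose status is already .completed.
    func makeWriter(outputURL: URL, size: CGSize) throws -> (AVAssetWriter, AVAssetWriterInput) {
        try? FileManager.default.removeItem(at: outputURL)  // writer refuses existing files
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(size.width),
            AVVideoHeightKey: Int(size.height),
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        return (writer, input)
    }

    // Per session: startWriting(), startSession(atSourceTime:), append buffers,
    // finishWriting { ... }, then discard the instance and call makeWriter again.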