AVFoundation

iOS: Synchronizing frames from camera and motion data

為{幸葍}努か submitted on 2019-12-04 09:03:39
I'm trying to capture frames from the camera along with the associated motion data. For synchronization I'm using timestamps. Video and motion are written to a file and then processed; in that process I can calculate the motion-to-frame offset for every video. It turns out that motion data and video data for the same timestamp are offset from each other by a varying amount, from 0.2 s up to 0.3 s. The offset is constant within one video but varies from video to video. If it were the same offset every time I could subtract a calibrated value, but it's not. Is there a good way to synchronize the timestamps? Maybe I'm not
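One approach worth trying, sketched below with illustrative names: reduce both streams to timestamps in seconds on the same clock (capture timestamps via CMTimeGetSeconds on the buffer's presentation time, motion samples via CMDeviceMotion's timestamp, both nominally relative to system boot), then estimate the per-video offset as the median difference between each frame and its nearest motion sample. This is a sketch under those assumptions, not a drop-in fix:

```swift
import Foundation

/// Estimate a constant offset between frame timestamps and motion
/// timestamps by taking the median of nearest-sample differences.
/// Assumes both arrays are sorted ascending, in seconds, and on the
/// same clock (e.g. seconds since boot).
func estimateOffset(frameTimes: [Double], motionTimes: [Double]) -> Double? {
    guard !frameTimes.isEmpty, !motionTimes.isEmpty else { return nil }

    // Binary search for the motion sample closest to a frame time.
    func nearestMotionTime(to t: Double) -> Double {
        var lo = 0, hi = motionTimes.count - 1
        while lo < hi {
            let mid = (lo + hi) / 2
            if motionTimes[mid] < t { lo = mid + 1 } else { hi = mid }
        }
        // Compare with the neighbour on the left, if any.
        if lo > 0, abs(motionTimes[lo - 1] - t) < abs(motionTimes[lo] - t) {
            return motionTimes[lo - 1]
        }
        return motionTimes[lo]
    }

    let diffs = frameTimes.map { nearestMotionTime(to: $0) - $0 }.sorted()
    return diffs[diffs.count / 2]   // median is robust to outliers
}
```

Subtracting the estimated offset from each motion timestamp before matching should align the streams; a median rather than a mean absorbs occasional dropped frames.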

Slow presentViewController performance

♀尐吖头ヾ submitted on 2019-12-04 08:59:31
I am using UIViewControllerTransitioningDelegate to build custom transitions between two view controllers (from an MKMapView to a custom camera built on AVFoundation). Everything goes well until I call presentViewController, at which point the phone seems to hang for about 1 second (when I log everything out). This even seems to happen when I am transitioning to a much simpler view: I have a view controller that only displays a UITextView, and even with that there appears to be a 0.4 to 0.5 second delay before the transition is actually called. This is currently how I am calling the transition

AVAudioConverter is broken in iOS 10

房东的猫 submitted on 2019-12-04 08:57:28
AVAudioConverter seems broken in iOS 10. The code worked in iOS 9, and now Error Domain=NSOSStatusErrorDomain Code=-50 "(null)" is returned no matter what audio format is used. It surprises me every year that basic library functionality stops working. func audioConverterFailureIOS10() { // Describe the audio format let inFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2) let outFormat = AVAudioFormat(standardFormatWithSampleRate: 22050, channels: 2) // Allocate buffers let outBuffer = AVAudioPCMBuffer(pcmFormat: outFormat, frameCapacity: 1024) // Create an input block

Why does capturing images with AVFoundation give me 480x640 images when the preset is 640x480?

冷暖自知 submitted on 2019-12-04 08:51:46
Question: I have some pretty basic code to capture a still image using AVFoundation. AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil]; AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init]; NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil]; [newStillImageOutput setOutputSettings:outputSettings]; [outputSettings release];
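A likely explanation, worth verifying against this setup: the sensor captures 640x480 landscape pixels, the JPEG carries an EXIF orientation tag, and UIImage's size property reports dimensions after applying that orientation, so a portrait capture reads back as 480x640. The helper below is purely illustrative of that swap, not part of the question's code:

```swift
import Foundation

/// Size of a capture as reported after applying orientation metadata:
/// portrait orientations swap the sensor-native landscape dimensions.
func orientedSize(sensorWidth: Int, sensorHeight: Int,
                  isPortrait: Bool) -> (width: Int, height: Int) {
    isPortrait ? (width: sensorHeight, height: sensorWidth)
               : (width: sensorWidth, height: sensorHeight)
}
```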

iOS5 AVFoundation image to video

≯℡__Kan透↙ submitted on 2019-12-04 08:45:13
Question: I'm trying to create a video from a single image and save it to my photo library. I've been googling around for ages and cannot find a solution. I have this code: @autoreleasepool { NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/movie2.mp4"]]; UIImage *img = [UIImage imageWithData:[[self imageDataArrya]objectAtIndex:0]imageData]; [self writeImageAsMovie:img toPath:path size:CGSizeMake(640, 960) duration:10];

Create a copy of CMSampleBuffer in Swift 2.0

微笑、不失礼 submitted on 2019-12-04 08:42:59
This has been asked before, but something must have changed in Swift since it was asked. I am trying to store the CMSampleBuffer objects returned from an AVCaptureSession to be processed later. After some experimentation I discovered that AVCaptureSession must be reusing its CMSampleBuffer references: when I try to keep more than 15, the session hangs. So I thought I would make copies of the sample buffers, but I can't seem to get it to work. Here is what I have written: var allocator: Unmanaged<CFAllocator>! = CFAllocatorGetDefault() var bufferCopy: UnsafeMutablePointer<CMSampleBuffer?> let err =
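One known pitfall regardless of which copy call is used: CMSampleBufferCreateCopy produces a shallow copy that still references the session's pooled pixel buffer, so retaining such copies can starve the pool just the same. A sketch of deep-copying the pixel data instead, assuming a single-plane format such as kCVPixelFormatType_32BGRA (planar formats would need per-plane copies; the function name is illustrative):

```swift
import CoreVideo
import Foundation

/// Deep-copies the pixels of a single-plane CVPixelBuffer so the
/// original (pooled) buffer can be released back to the capture
/// session. Extract the source with CMSampleBufferGetImageBuffer.
func copyPixelBuffer(_ src: CVPixelBuffer) -> CVPixelBuffer? {
    CVPixelBufferLockBaseAddress(src, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(src, .readOnly) }

    var dst: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     CVPixelBufferGetWidth(src),
                                     CVPixelBufferGetHeight(src),
                                     CVPixelBufferGetPixelFormatType(src),
                                     nil, &dst)
    guard status == kCVReturnSuccess, let copy = dst else { return nil }

    CVPixelBufferLockBaseAddress(copy, [])
    defer { CVPixelBufferUnlockBaseAddress(copy, []) }

    guard let srcBase = CVPixelBufferGetBaseAddress(src),
          let dstBase = CVPixelBufferGetBaseAddress(copy) else { return nil }

    // Strides can differ between buffers, so copy row by row.
    let srcStride = CVPixelBufferGetBytesPerRow(src)
    let dstStride = CVPixelBufferGetBytesPerRow(copy)
    let rowBytes = min(srcStride, dstStride)
    for row in 0..<CVPixelBufferGetHeight(src) {
        memcpy(dstBase + row * dstStride, srcBase + row * srcStride, rowBytes)
    }
    return copy
}
```

In the capture delegate one would then store the returned buffer (plus the frame's presentation timestamp) rather than the CMSampleBuffer itself.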

AVCaptureOutput didOutputSampleBuffer stops getting called

限于喜欢 submitted on 2019-12-04 08:39:42
I have an issue with the delegate method didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection of AVCaptureOutput. It stops getting called within a second or two when I'm adding the sampleBuffer to a CFArray. If I remove the CFArray code, the delegate method continues to get called, so I have no idea why the CFArray code is causing it to stop. I'd appreciate any help. @property CFMutableArrayRef sampleBufferArray; - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:
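The behaviour in both of these retention questions is consistent with the capture session drawing on a small fixed pool of sample buffers: once too many are retained, there is nothing left to deliver into and the callbacks stop (the AVCaptureVideoDataOutput documentation explicitly warns against holding onto sample buffers for long). A minimal sketch of one mitigation, with illustrative names: copy what you need out of each buffer and cap how many items you retain.

```swift
import Foundation

/// A bounded FIFO: appending beyond `capacity` drops the oldest
/// element, so the number of retained items never grows unbounded.
struct BoundedBuffer<Element> {
    private(set) var elements: [Element] = []
    let capacity: Int

    init(capacity: Int) {
        precondition(capacity > 0)
        self.capacity = capacity
    }

    mutating func append(_ element: Element) {
        elements.append(element)
        if elements.count > capacity {
            elements.removeFirst()   // release the oldest item
        }
    }
}
```

In the delegate you would append a copy of the data you need (pixel bytes, timestamps), never the CMSampleBuffer itself, so the pooled buffers are returned promptly.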

ios/iphone photo burst mode api

一世执手 submitted on 2019-12-04 08:39:36
Question: I'm trying to capture multiple photos at the highest resolution (AVCaptureSessionPresetPhoto) on an iPhone 5s. I tried using the following code: dispatch_semaphore_t sync = dispatch_semaphore_create(0); while( [self isBurstModeEnabled] == YES ) { [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) { if (imageSampleBuffer != NULL) { NSData *imageData = [AVCaptureStillImageOutput

iOS Extracting Audio from .mov file

风流意气都作罢 submitted on 2019-12-04 08:38:31
Question: I've been trying to extract audio from a .mov file for a while now and I just can't seem to get it working. Specifically, I need to extract the audio and save it as an .aif or .aiff file. I've tried using an AVMutableComposition, loading the .mov file as an AVAsset, adding only the audio track to the AVMutableComposition, and finally using an AVAssetExportSession (setting the output file type to AVFileTypeAIFF, which is the format I need it in) to write the file to an .aif. I get an

iOS swift convert mp3 to aac

泪湿孤枕 submitted on 2019-12-04 08:38:16
I'm converting an mp3 to m4a in Swift with code based on this. It works when I generate a PCM file, but when I change the export format to m4a it generates a file that won't play. Why is it corrupt? Here is the code so far: import AVFoundation import UIKit class ViewController: UIViewController { var rwAudioSerializationQueue:dispatch_queue_t! var asset:AVAsset! var assetReader:AVAssetReader! var assetReaderAudioOutput:AVAssetReaderTrackOutput! var assetWriter:AVAssetWriter! var assetWriterAudioInput:AVAssetWriterInput! var outputURL:NSURL! override func viewDidLoad() { super.viewDidLoad() let