AVCaptureSession

Modify AVCaptureSession before saving with AVCaptureMovieFileOutput

Submitted by 眉间皱痕 on 2019-12-04 14:00:16
Question: Use case: I want to capture input from the camera, draw on top of the captured frames (and sound), and save the result as a .mov file. I see that I can capture input from the camera using AVCaptureSession, and I can save this to a .mov file using AVCaptureMovieFileOutput. AVVideoComposition can be used to add Core Animation for playback; I assume it can somehow be used for recording too? Problem: I can't see how to modify the input before it is saved to file. Answer 1: The RosyWriter sample was almost doing what I wanted.
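The RosyWriter approach the answer alludes to replaces AVCaptureMovieFileOutput (which writes frames untouched) with AVCaptureVideoDataOutput feeding an AVAssetWriter, so each frame can be modified before it is written. A minimal sketch of that pipeline, not the RosyWriter code itself — the class name and the Core Image overlay step are illustrative assumptions:

```swift
import AVFoundation
import CoreImage

// Sketch: receive raw frames, modify them, and write them with AVAssetWriter.
// Session setup and audio handling are omitted for brevity.
final class AnnotatingRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private let ciContext = CIContext()

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 640, AVVideoHeightKey: 480])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input, sourcePixelBufferAttributes: nil)
        writer.add(input)
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              input.isReadyForMoreMediaData else { return }
        let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if writer.status == .unknown {           // first frame starts the session
            writer.startWriting()
            writer.startSession(atSourceTime: time)
        }
        // Draw on the frame before it is written, e.g. composite a CIImage
        // overlay here; rendering the (possibly modified) image back in place.
        let frame = CIImage(cvPixelBuffer: pixelBuffer)
        ciContext.render(frame, to: pixelBuffer)
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }
}
```

Finishing the file requires `input.markAsFinished()` followed by `writer.finishWriting { … }` once capture stops.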

AVCaptureSession returning blank image on iPhone 3G only

Submitted by ≡放荡痞女 on 2019-12-04 11:44:58
I'm using Apple's example code for AVCaptureSession, and the UIImage that gets created is completely blank. This only happens on the iPhone 3G, along with a unique error on the console: Error: CGDataProviderCreateWithCopyOfData: vm_copy failed: status 2. I researched the error online and found this StackOverflow answer, which gets rid of the error... but the image is still blank. Has anyone else experienced this and know how to fix it? Thanks in advance. My code: CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); CVPixelBufferLockBaseAddress…
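The Objective-C excerpt above is the start of Apple's classic sample-buffer-to-image routine. A Swift sketch of the full routine for comparison — it assumes the data output's `videoSettings` request `kCVPixelFormatType_32BGRA`, since a pixel-format mismatch between the capture output and the bitmap context is a common cause of blank images:

```swift
import AVFoundation
import UIKit

// Convert a video sample buffer to a UIImage. Assumes the capture output
// delivers BGRA pixels (set kCVPixelFormatType_32BGRA on the data output).
func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(imageBuffer) else { return nil }
    let context = CGContext(data: base,
                            width: CVPixelBufferGetWidth(imageBuffer),
                            height: CVPixelBufferGetHeight(imageBuffer),
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue)
    return context?.makeImage().map { UIImage(cgImage: $0) }
}
```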

iOS Swift 2 Record Video AVCaptureSession

Submitted by 回眸只為那壹抹淺笑 on 2019-12-04 10:03:51
I created an AVCaptureSession and attached the front-facing camera to it: do { try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice)) } catch { print("err") } Now I want to start and stop recording on touch events. How do I do this? override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) { print("touch") //Start Recording } override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) { print("release"); //End Recording and Save } You didn't mention whether you're using AVCaptureMovieFileOutput or AVCaptureVideoDataOutput as the output for your…
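Assuming AVCaptureMovieFileOutput (the simpler of the two outputs the answer mentions), the touch handlers reduce to start/stop calls. A sketch in current Swift spelling — the question itself uses Swift 2, and the output is assumed to have been added to the session already:

```swift
import AVFoundation
import UIKit

class RecorderViewController: UIViewController, AVCaptureFileOutputRecordingDelegate {
    let captureSession = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()  // must be added to captureSession

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("clip.mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        movieOutput.stopRecording()  // the delegate fires when the file is finalized
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // Save or process outputFileURL here.
    }
}
```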

Exporting AVCaptureSession video in a size that matches the preview layer

Submitted by 余生颓废 on 2019-12-04 09:34:34
Question: I'm recording video using AVCaptureSession with the session preset AVCaptureSessionPreset640x480. I'm using an AVCaptureVideoPreviewLayer at a non-standard size (300 x 300) with the gravity set to aspect fill while recording. It's set up like this: self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession]; _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; _previewLayer.frame = _previewView.bounds; // 300 x 300 [_previewView.layer addSublayer:…
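With a 640x480 recording behind an aspect-fill square preview, the preview shows only the centered 480x480 region, so one way to make the export match is to crop with an AVMutableVideoComposition. A sketch under those assumptions (the helper name is hypothetical, and a rotated/portrait track would additionally need the track's preferredTransform folded in):

```swift
import AVFoundation

// Hypothetical helper: crop a landscape 640x480 recording to the centered
// square that an aspect-fill square preview layer actually showed.
func squareExport(of asset: AVAsset, to url: URL) {
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let side = min(track.naturalSize.width, track.naturalSize.height)  // 480

    let composition = AVMutableVideoComposition()
    composition.renderSize = CGSize(width: side, height: side)
    composition.frameDuration = CMTime(value: 1, timescale: 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    // Shift left so the centered square region lands at the render origin.
    let xOffset = (track.naturalSize.width - side) / 2  // 80 for 640x480
    layerInstruction.setTransform(CGAffineTransform(translationX: -xOffset, y: 0), at: .zero)
    instruction.layerInstructions = [layerInstruction]
    composition.instructions = [instruction]

    let export = AVAssetExportSession(asset: asset,
                                      presetName: AVAssetExportPresetHighestQuality)
    export?.videoComposition = composition
    export?.outputURL = url
    export?.outputFileType = .mov
    export?.exportAsynchronously { /* check export?.status and export?.error */ }
}
```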

iOS Custom Keyboard - camera not working

Submitted by 南楼画角 on 2019-12-04 09:21:48
Question: I want to create a custom keyboard that acts as a barcode scanner. I have already done the whole coding, but the output is not as expected: I am asked for camera permissions (the first time), but the camera sends no video to the view. I think there might be some restrictions on keyboards, for safety reasons. 1.) Turn on the torch -(void) turnFlashOn { AVCaptureDevice *flashLight = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; if([flashLight isTorchAvailable] &&…
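The questioner's suspicion is right: camera capture is not available inside app extensions such as custom keyboards, and no code works around that. The torch code itself is fine, though; a Swift sketch of the same torch toggle (for a host app, where it would actually run) looks like this:

```swift
import AVFoundation

// Swift equivalent of the Objective-C torch code above. Note this cannot
// rescue the keyboard-extension case: capture is blocked in extensions.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch, device.isTorchAvailable else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}
```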

AVCaptureOutput didOutputSampleBuffer stops getting called

Submitted by 限于喜欢 on 2019-12-04 08:39:42
I have an issue with the delegate method didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection of AVCaptureOutput. It stops getting called within a second or two when I add the sampleBuffer to a CFArray. If I remove the CFArray code, the delegate method continues to be called, so I have no idea why the CFArray code causes it to stop. I'd appreciate any help. @property CFMutableArrayRef sampleBufferArray; - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:…
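The usual explanation for this symptom is that the capture pipeline recycles a small fixed pool of sample buffers; retaining them in an array starves the pool and delivery stops. The fix is to copy out the data you need and let the original buffer go. A sketch of a pixel-buffer deep copy under the assumption of a non-planar (e.g. BGRA) buffer:

```swift
import AVFoundation
import CoreVideo

// Deep-copy a non-planar pixel buffer so the capture pool's buffer can be
// released immediately. Planar formats would need a per-plane copy instead.
func deepCopy(_ pixelBuffer: CVPixelBuffer) -> CVPixelBuffer? {
    var copyOut: CVPixelBuffer?
    let height = CVPixelBufferGetHeight(pixelBuffer)
    CVPixelBufferCreate(nil,
                        CVPixelBufferGetWidth(pixelBuffer),
                        height,
                        CVPixelBufferGetPixelFormatType(pixelBuffer),
                        nil, &copyOut)
    guard let copy = copyOut else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    CVPixelBufferLockBaseAddress(copy, [])
    defer {
        CVPixelBufferUnlockBaseAddress(copy, [])
        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
    }
    guard let src = CVPixelBufferGetBaseAddress(pixelBuffer),
          let dst = CVPixelBufferGetBaseAddress(copy) else { return nil }

    // Copy row by row: the two buffers may have different bytes-per-row.
    let srcBPR = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let dstBPR = CVPixelBufferGetBytesPerRow(copy)
    for row in 0..<height {
        memcpy(dst.advanced(by: row * dstBPR),
               src.advanced(by: row * srcBPR),
               min(srcBPR, dstBPR))
    }
    return copy
}
```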

ios/iphone photo burst mode api

Submitted by 一世执手 on 2019-12-04 08:39:36
Question: I'm trying to capture multiple photos at the highest resolution (AVCaptureSessionPresetPhoto) on an iPhone 5s. I tried using the following code: dispatch_semaphore_t sync = dispatch_semaphore_create(0); while( [self isBurstModeEnabled] == YES ) { [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) { if (imageSampleBuffer != NULL) { NSData *imageData = [AVCaptureStillImageOutput…
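The semaphore pattern in the excerpt serializes shots so each capture finishes before the next is requested. A Swift sketch of the same loop, using the (since-deprecated) AVCaptureStillImageOutput API from the question; the `burstEnabled` and `handler` closures stand in for the questioner's own state and saving code:

```swift
import AVFoundation

// Serialized burst loop: wait for each still capture to complete before
// requesting the next. Must not run on the main queue, since it blocks.
func captureBurst(output: AVCaptureStillImageOutput,
                  connection: AVCaptureConnection,
                  while burstEnabled: @escaping () -> Bool,
                  handler: @escaping (Data) -> Void) {
    let sync = DispatchSemaphore(value: 0)
    DispatchQueue.global().async {
        while burstEnabled() {
            output.captureStillImageAsynchronously(from: connection) { buffer, _ in
                if let buffer = buffer,
                   let data = AVCaptureStillImageOutput
                       .jpegStillImageNSDataRepresentation(buffer) {
                    handler(data)
                }
                sync.signal()
            }
            sync.wait()  // block until the shot just requested has completed
        }
    }
}
```

On current iOS versions the same idea would be expressed with AVCapturePhotoOutput, which also offers genuine hardware burst/bracketed capture.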

How to generate a UIImage from AVCapturePhoto with correct orientation?

Submitted by 浪子不回头ぞ on 2019-12-04 08:37:44
Question: I am calling AVFoundation's delegate method to handle a photo capture, but I am having difficulty converting the AVCapturePhoto it generates into a UIImage with the correct orientation. Although the routine below succeeds, I always get a right-oriented UIImage (UIImage.imageOrientation = 3, i.e. .right). I have no way of providing an orientation when using UIImage(data: image), and attempting to first use photo.cgImageRepresentation()?.takeRetainedValue() also doesn't help. Please assist. Image…
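One common fix is to read the EXIF orientation out of the photo's metadata and pass it explicitly when wrapping the CGImage. A sketch, using the iOS 15+ spelling of `cgImageRepresentation()` (earlier SDKs, as in the question, returned an `Unmanaged<CGImage>`); the mapping function is a hypothetical helper, since UIKit does not provide the CGImage-to-UIImage orientation conversion directly:

```swift
import AVFoundation
import UIKit
import ImageIO

// Map the EXIF orientation enum onto UIKit's. The two enums use different
// raw values, so a hand-written switch is the safe conversion.
func uiOrientation(_ cg: CGImagePropertyOrientation) -> UIImage.Orientation {
    switch cg {
    case .up: return .up
    case .upMirrored: return .upMirrored
    case .down: return .down
    case .downMirrored: return .downMirrored
    case .left: return .left
    case .leftMirrored: return .leftMirrored
    case .right: return .right
    case .rightMirrored: return .rightMirrored
    }
}

// Build a correctly oriented UIImage from an AVCapturePhoto.
func uiImage(from photo: AVCapturePhoto) -> UIImage? {
    guard let cgImage = photo.cgImageRepresentation() else { return nil }
    let raw = (photo.metadata[String(kCGImagePropertyOrientation)] as? NSNumber)?
        .uint32Value
    let cgOrientation = raw.flatMap(CGImagePropertyOrientation.init) ?? .up
    return UIImage(cgImage: cgImage, scale: 1.0,
                   orientation: uiOrientation(cgOrientation))
}
```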

Swift AVCaptureSession Close/Open Button Error: Multiple audio/video AVCaptureInputs are not currently supported

Submitted by 别等时光非礼了梦想. on 2019-12-04 06:05:53
I have working barcode scanner code. When I click the openCamera button the first time, everything is good. When I click the closeCamera button, still good, but if I then click the openCamera button again, it gives a fatal error. The code and error are below. In fact, is it possible to toggle the camera view with one button? // Barcode Camera Properties let captureSession = AVCaptureSession() var captureDevice:AVCaptureDevice? var captureLayer:AVCaptureVideoPreviewLayer? override func viewDidLoad() { super.viewDidLoad() self.cameraView.alpha = 0 } @IBAction func closeCamera(sender: AnyObject) { self.captureLayer!…
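The error typically means the open handler re-adds an AVCaptureDeviceInput to a session that still holds the first one. A sketch of a single toggle button that adds the input only once and otherwise just starts/stops the session; it assumes the `captureSession`, `captureLayer`, and `cameraView` properties from the question:

```swift
import AVFoundation
import UIKit

// Single open/close toggle: the input is added at most once, so the
// "Multiple audio/video AVCaptureInputs" error cannot occur.
@IBAction func toggleCamera(_ sender: UIButton) {
    if captureSession.isRunning {
        captureSession.stopRunning()
        captureLayer?.removeFromSuperlayer()
        captureLayer = nil
        cameraView.alpha = 0
    } else {
        if captureSession.inputs.isEmpty,
           let device = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: device),
           captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }
        let layer = AVCaptureVideoPreviewLayer(session: captureSession)
        layer.frame = cameraView.bounds
        cameraView.layer.addSublayer(layer)
        captureLayer = layer
        cameraView.alpha = 1
        captureSession.startRunning()
    }
}
```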

AVCaptureSession for audio in simulator

Submitted by 孤人 on 2019-12-04 04:11:45
I'm trying to capture audio using the method in this question, with AVCaptureSession and AVCaptureAudioDataOutput. This seems to work fine, with one inconvenience: it doesn't work in the simulator. Both AVAudioRecorder and the good old SpeakHere demo app work fine in the simulator, using the internal microphone on my MacBook Pro. The problem is that [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] returns nil in the simulator, so the subsequent code fails with the message (when it tries to add nil as input to the AVCaptureSession): *** Terminating app due to uncaught exception…
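Since the simulator exposes no AVCaptureDevice, the practical fix is to guard against the nil device instead of crashing, and fall back (for example, to AVAudioRecorder, which the questioner notes does work there). A Swift sketch of that defensive setup:

```swift
import AVFoundation

// Build an audio capture session, failing gracefully where no capture
// device exists (e.g. the simulator) instead of crashing on a nil input.
func makeAudioCaptureSession() -> AVCaptureSession? {
    guard let mic = AVCaptureDevice.default(for: .audio),
          let input = try? AVCaptureDeviceInput(device: mic) else {
        print("No audio capture device; running in the simulator?")
        return nil  // caller can fall back to AVAudioRecorder here
    }
    let session = AVCaptureSession()
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let output = AVCaptureAudioDataOutput()
    if session.canAddOutput(output) {
        session.addOutput(output)
        // Assign output.setSampleBufferDelegate(_:queue:) before starting.
    }
    return session
}
```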