AVFoundation

Camera differences between UIImagePickerController and AVCaptureSession on iPhone

[亡魂溺海] Submitted on 2019-11-30 12:33:48
Question: I'm trying to build a replacement for UIImagePickerController, using AVCaptureSession with AVCaptureDeviceInput and AVCaptureStillImageOutput as input and output respectively. To preview the camera stream I'm using AVCaptureVideoPreviewLayer. It now works correctly for capturing and storing photos, just like the default camera. However, I found 3 problems I was unable to solve: captured photos don't get the same quality the default camera provides; the viewing/capture angle is shortened,
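
A minimal Swift sketch (not from the question) of the two settings that usually affect perceived quality and field of view when replacing UIImagePickerController: the Photo session preset for full-resolution stills, and the preview layer's video gravity, which controls how much of the frame the preview crops away.

import AVFoundation
import UIKit

// Photo preset gives full-sensor stills, like the stock camera app.
let session = AVCaptureSession()
session.sessionPreset = .photo

// Aspect-fill crops the preview to fill the view; .resizeAspect shows the
// full capture angle with letterboxing instead.
let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.videoGravity = .resizeAspectFill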

How can I extract an AVMetadataObject from a UIImage?

随声附和 Submitted on 2019-11-30 12:13:22
Question: I'd like to use iOS 7's new barcode scanning functionality with a UIImage instead of live capture from one of the device's cameras. I already have the detection working fine with an AVCaptureDeviceInput. The best way I can think of to do this would be to create a concrete subclass of AVCaptureInput that provides media data to an AVCaptureSession from a UIImage. However, I can't find any documentation or examples on how to subclass AVCaptureInput, so I'm at a loss. An alternative would be to
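
One commonly suggested alternative, not mentioned in the question, is Core Image's CIDetector, which works directly on still images; note it returns CIFeature objects rather than AVMetadataObject. A sketch in Swift, assuming a QR code and iOS 8 or later:

import CoreImage
import UIKit

// Detect a QR code in a still image without an AVCaptureSession.
func decodeQRCode(in image: UIImage) -> String? {
    guard let ciImage = CIImage(image: image),
          let detector = CIDetector(ofType: CIDetectorTypeQRCode,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]) else {
        return nil
    }
    let features = detector.features(in: ciImage)
    return (features.first as? CIQRCodeFeature)?.messageString
}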

Tap Mic Input Using AVAudioEngine in Swift

这一生的挚爱 Submitted on 2019-11-30 10:53:43
Question: I'm really excited about the new AVAudioEngine. It seems like a good API wrapper around Audio Units. Unfortunately the documentation is so far nonexistent, and I'm having problems getting a simple graph to work. Using the following simple code to set up an audio engine graph, the tap block is never called. It mimics some of the sample code floating around the web, though those examples also did not work. let inputNode = audioEngine.inputNode var error: NSError? let bus = 0 inputNode.installTapOnBus(bus
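
A sketch (in current Swift syntax, not the question's exact code) of an input tap that does get called: the points that commonly matter are using the input node's own format and actually starting the engine, since the tap only fires while the engine is running.

import AVFoundation

let audioEngine = AVAudioEngine()
let inputNode = audioEngine.inputNode
let bus = 0
let format = inputNode.inputFormat(forBus: bus)

inputNode.installTap(onBus: bus, bufferSize: 1024, format: format) { buffer, time in
    // Process or store the captured audio here.
    print("Got \(buffer.frameLength) frames")
}

do {
    audioEngine.prepare()
    try audioEngine.start()   // the tap block only fires once the engine is running
} catch {
    print("Failed to start engine: \(error)")
}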

iOS AVFoundation tap to focus

…衆ロ難τιáo~ Submitted on 2019-11-30 10:52:11
Question: I am trying to create a camera app which would act, more or less, like the default camera app. The thing that is not working for me at the moment is tap to focus. I want the camera to focus and do whatever it does at the point I touch, just like the real camera app does. Here's my viewDidLoad: - (void)viewDidLoad { [super viewDidLoad]; // Session _session = [[AVCaptureSession alloc] init]; _session.sessionPreset = AVCaptureSessionPresetPhoto; // Input _videoDevice = [AVCaptureDevice
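
A Swift sketch of a tap-to-focus handler, assuming the capture device and preview layer from the question are available as `device` and `previewLayer` (names are placeholders): the tap point has to be converted from layer coordinates into the device's normalized (0,0)–(1,1) space before being set as the focus and exposure point of interest.

import AVFoundation
import UIKit

func focus(at layerPoint: CGPoint, device: AVCaptureDevice, previewLayer: AVCaptureVideoPreviewLayer) {
    // Convert the tap from layer coordinates to the device's normalized space.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}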

Problem in writing metadata to image

孤者浪人 Submitted on 2019-11-30 10:48:05
I am using AVFoundation to take a still image, adding GPS info to the metadata, and saving it to the photo album using the Assets Library, but the GPS info is not being saved at all. Here is my code... [self.stillImageTaker captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) { if (imageDataSampleBuffer != NULL) { CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL); CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, imageDataSampleBuffer, kCMAttachmentMode
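
A Swift sketch (not the question's code) of building the GPS dictionary that has to be merged into the image metadata under kCGImagePropertyGPSDictionary before the save call; the ImageIO GPS keys expect unsigned coordinate values with separate N/S and E/W reference strings, which is a common reason the location silently fails to save.

import ImageIO
import CoreLocation

func gpsMetadata(for location: CLLocation) -> [CFString: Any] {
    // GPS timestamp is stored as a UTC time-of-day string.
    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm:ss.SS"
    formatter.timeZone = TimeZone(abbreviation: "UTC")

    let latitude = location.coordinate.latitude
    let longitude = location.coordinate.longitude
    return [
        kCGImagePropertyGPSLatitude: abs(latitude),
        kCGImagePropertyGPSLatitudeRef: latitude >= 0 ? "N" : "S",
        kCGImagePropertyGPSLongitude: abs(longitude),
        kCGImagePropertyGPSLongitudeRef: longitude >= 0 ? "E" : "W",
        kCGImagePropertyGPSAltitude: location.altitude,
        kCGImagePropertyGPSTimeStamp: formatter.string(from: location.timestamp)
    ]
}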

How to overlay one video on another in iOS?

房东的猫 Submitted on 2019-11-30 10:33:39
I am trying to crop an already-taken video into a circle in iOS. How might I go about doing this? I know how I would do it with AVCaptureSession, but I don't know how to pass an already-taken video in as an AVCaptureDevice. Is there a way to crop a video into a circle? I want to overlay it on top of another video, so it has to have a transparent background as well. Thanks. I guess you want to produce something like this: You don't want an AVCaptureSession, because you're not capturing video. You want an AVMutableComposition. You need to read the “Editing” section of the AV Foundation Programming
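
A Swift sketch of the composition setup the answer points at: both existing assets go into one AVMutableComposition as separate video tracks. This only stacks the two tracks; the circular mask itself would still need a custom AVVideoCompositing compositor or a Core Animation mask at playback/export time, which is beyond this sketch.

import AVFoundation

func makeOverlayComposition(base: AVAsset, overlay: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: base.duration)

    // Base (background) video track.
    if let baseTrack = composition.addMutableTrack(withMediaType: .video,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid),
       let sourceBase = base.tracks(withMediaType: .video).first {
        try baseTrack.insertTimeRange(range, of: sourceBase, at: .zero)
    }
    // Overlay video track, to be masked into a circle by a custom compositor.
    if let overlayTrack = composition.addMutableTrack(withMediaType: .video,
                                                      preferredTrackID: kCMPersistentTrackID_Invalid),
       let sourceOverlay = overlay.tracks(withMediaType: .video).first {
        try overlayTrack.insertTimeRange(range, of: sourceOverlay, at: .zero)
    }
    return composition
}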

Save sampleBuffer in array (AVFoundation)

扶醉桌前 Submitted on 2019-11-30 10:25:22
I am trying to save the sample buffer, instead of a UIImage, to an array so I can convert it later on. This is to speed up image capturing and perhaps avoid memory warnings. I just can't figure out how to save it to the array and then use it again to call [self imageFromSampleBuffer:sampleBuffer]. I tried something like this, but how do I convert the data back to a CMSampleBufferRef object? - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { // Create a UIImage from the sample buffer data //
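
A Swift sketch of one way to hold on to the buffers: CMSampleBuffer is reference counted by ARC in Swift, so appending it to an array keeps it alive for later conversion (in Objective-C the equivalent is CFRetain before storing). One caveat: the camera recycles buffers from a small pool, so holding many of them can stall capture; copy out the pixel data if more than a few frames need to be kept.

import AVFoundation

final class FrameStore: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var buffers: [CMSampleBuffer] = []

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        buffers.append(sampleBuffer)   // retained by the array; convert to UIImage later
    }
}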

How do I set the orientation for a frame-by-frame-generated video using AVFoundation?

左心房为你撑大大i Submitted on 2019-11-30 10:09:08
I am writing an iPhone app which takes video from the camera, runs it through some OpenGL shader code, and then writes the output to a video file using AVFoundation. The app runs in landscape orientation (either one), and therefore all video recorded should be landscape. The current code I use before starting recording to get the video the right way round is: [[self videoWriterInput] setTransform:CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI), -1.0, 1.0)]; where videoWriterInput is an instance of AVAssetWriterInput and the aim is to compensate for the landscape mode and the reversed
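
A Swift sketch of a simpler transform choice, assuming the back camera, whose sensor's natural orientation is landscape-right: landscape-right then needs no rotation at all, and landscape-left needs a plain 180° rotation rather than a combined scale-and-rotate.

import AVFoundation
import CoreGraphics

// Returns the writer-input transform for the two landscape orientations.
func writerTransform(forLandscapeLeft isLandscapeLeft: Bool) -> CGAffineTransform {
    return isLandscapeLeft ? CGAffineTransform(rotationAngle: .pi) : .identity
}

// e.g. videoWriterInput.transform = writerTransform(forLandscapeLeft: true)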

How to take UIImage of AVCaptureVideoPreviewLayer instead of AVCapturePhotoOutput capture

送分小仙女□ Submitted on 2019-11-30 09:59:36
I want to "stream" the preview layer to my server, however, I only want specific frames to be sent. Basically, I want to take a snapshot of the AVCaptureVideoPreviewLayer, scale it down to 28*28, turn it into an intensity array, and send it to my socket layer where my python backend handles the rest. Problem here is that AVCapturePhotoOutput's capture function is insanely slow. I can't repeatedly call the function. Not to mention it always makes a camera shutter sound haha. The other problem is that taking a snapshot of AVCaptureVideoPreviewLayer is really difficult. Using

Setting AVMutableComposition's frameDuration

我与影子孤独终老i Submitted on 2019-11-30 09:54:45
I'm playing with the AVEditDemo project from Apple's WWDC 2010 sample pack, and I'm trying to change the frame rate of the exported video. The video is exported using an AVMutableComposition, with the video composition's frameDuration set like this: videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps For some reason, changing the 30 to 25 does not change the frame rate of the video exported with the AVAssetExportSession. Does anyone have an idea why? MonsieurDart: It seems that the AVAssetExportSession preset takes precedence over the AVVideoComposition frameDuration. I've opened a bug report:
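
For reference, a Swift sketch of how the pieces under discussion fit together: the frame duration lives on the video composition, which then has to be attached to the export session. As the answer notes, the export preset may still override the requested rate; the composition, output URL, and instructions here are placeholders, not a verified fix.

import AVFoundation
import CoreGraphics
import CoreMedia

// Request 25 fps on the video composition (instructions omitted in this sketch).
let videoComposition = AVMutableVideoComposition()
videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 25)
videoComposition.renderSize = CGSize(width: 1280, height: 720)

let composition = AVMutableComposition()   // hypothetical composition built elsewhere
if let exportSession = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality) {
    exportSession.videoComposition = videoComposition
    exportSession.outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + "out.mov")
    exportSession.outputFileType = .mov
    exportSession.exportAsynchronously {
        print("Export finished with status: \(exportSession.status.rawValue)")
    }
}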