AVCaptureSession

How to generate a UIImage from AVCapturePhoto with the correct orientation?

回眸只為那壹抹淺笑 submitted on 2019-12-02 23:03:07
I am implementing AVFoundation's delegate method to handle a photo capture, but I am having difficulty converting the AVCapturePhoto it delivers into a UIImage with the correct orientation. Although the routine below succeeds, I always get a right-oriented UIImage (UIImage.imageOrientation = 3). There is no way to provide an orientation when using UIImage(data: image), and attempting to first use photo.cgImageRepresentation()?.takeRetainedValue() also doesn't help. Please assist. Image orientation is critical here, as the resulting image is fed into a Vision framework workflow. func …
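
One common fix, sketched below under the assumption that the app targets a recent SDK (where cgImageRepresentation() returns CGImage? directly, with no takeRetainedValue() step): read the EXIF orientation out of photo.metadata and hand it to the UIImage(cgImage:scale:orientation:) initializer. The orientation-mapping extension is an assumed helper, needed because CGImagePropertyOrientation and UIImage.Orientation use different raw values.

    import AVFoundation
    import UIKit

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let cgImage = photo.cgImageRepresentation() else { return }

        // The EXIF orientation is stored in the metadata as a raw
        // CGImagePropertyOrientation value.
        let rawValue = photo.metadata[kCGImagePropertyOrientation as String] as? UInt32
        let cgOrientation = rawValue.flatMap { CGImagePropertyOrientation(rawValue: $0) } ?? .up

        let image = UIImage(cgImage: cgImage,
                            scale: 1.0,
                            orientation: UIImage.Orientation(cgOrientation))
        // `image` now carries the correct orientation; hand it to the Vision workflow.
    }

    // Assumed helper: the two enums deliberately use different raw values,
    // so a manual mapping is required.
    extension UIImage.Orientation {
        init(_ cgOrientation: CGImagePropertyOrientation) {
            switch cgOrientation {
            case .up: self = .up
            case .upMirrored: self = .upMirrored
            case .down: self = .down
            case .downMirrored: self = .downMirrored
            case .left: self = .left
            case .leftMirrored: self = .leftMirrored
            case .right: self = .right
            case .rightMirrored: self = .rightMirrored
            @unknown default: self = .up
            }
        }
    }

For a Vision pipeline specifically, the UIImage detour can be skipped entirely: VNImageRequestHandler(cgImage:orientation:options:) accepts the CGImagePropertyOrientation as-is.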

Show camera stream while AVCaptureSession's running

℡╲_俬逩灬. submitted on 2019-12-02 19:39:14
I was able to capture video frames from the camera using AVCaptureSession, following http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html . However, it seems that AVCaptureSession captures frames from the camera without showing the camera stream on the screen. I would also like to show the camera stream, just as UIImagePicker does, so that the user knows the camera is on and can see what it is pointed at. Any help or pointers would be appreciated! AVCaptureVideoPreviewLayer is exactly what you're looking for. The code fragment Apple uses to demonstrate how to use it …
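
A minimal sketch of that answer's approach, assuming `session` is the already-configured AVCaptureSession:

    import AVFoundation
    import UIKit

    func installPreviewLayer(for session: AVCaptureSession, in view: UIView) {
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill // fill the view, cropping edges as needed
        view.layer.addSublayer(previewLayer)
    }

The layer renders the live feed on its own; no per-frame drawing is needed, and any AVCaptureVideoDataOutput delegate keeps processing frames in parallel.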

How to use AVCaptureSession with Slide Over and Split View in iOS 9?

喜欢而已 submitted on 2019-12-02 19:10:55
My team is developing a set of SDKs for barcode scanning, ID scanning, and OCR. We use the device's camera, specifically AVCaptureSession, to obtain the video frames on which we perform our processing. We're exploring the new iOS 9 multitasking features, Slide Over and Split View. Apple suggests opting out of these features for camera-centric apps, where using the entire screen for the preview and capturing a moment quickly are primary features (reference). This is the approach they use in their sample app AVCam. However, our customers might have apps which don't fall into this category (e.g. Mobile …
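
For apps that do opt in to multitasking, iOS 9 reports the camera being taken by another foreground app as a session interruption. A sketch, assuming `session` is the configured AVCaptureSession:

    import AVFoundation

    NotificationCenter.default.addObserver(
        forName: .AVCaptureSessionWasInterrupted,
        object: session,
        queue: .main
    ) { notification in
        if let raw = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
           let reason = AVCaptureSession.InterruptionReason(rawValue: raw),
           reason == .videoDeviceNotAvailableWithMultipleForegroundApps {
            // The camera is unavailable while sharing the screen: pause scanning
            // and show an explanatory overlay instead of a frozen preview.
        }
    }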

AVCaptureDevice Camera Zoom

血红的双手。 submitted on 2019-12-02 18:40:00
I have a simple AVCaptureSession running to get a camera feed in my app and take photos. How can I implement 'pinch to zoom' for the camera using a UIGestureRecognizer? Gabriel Cartier: The accepted answer is actually outdated, and I'm not sure it will actually take the photo of the zoomed-in image. There is a method to zoom in, as bcattle's answer says. The problem with his answer is that it does not account for the fact that the user can zoom in and then start a new gesture from that zoom position. His solution will create jumps that are not really elegant. The easiest and …
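
The rest of that answer amounts to remembering the zoom factor between gestures. A minimal sketch, assuming `captureDevice` is the active AVCaptureDevice and the recognizer is attached to the preview view:

    import AVFoundation
    import UIKit

    var lastZoomFactor: CGFloat = 1.0

    @objc func handlePinch(_ pinch: UIPinchGestureRecognizer) {
        // Scale relative to the last committed zoom so a new pinch resumes
        // from the current position instead of jumping back to 1x.
        let newFactor = min(max(lastZoomFactor * pinch.scale, 1.0),
                            captureDevice.activeFormat.videoMaxZoomFactor)
        do {
            try captureDevice.lockForConfiguration()
            captureDevice.videoZoomFactor = newFactor
            captureDevice.unlockForConfiguration()
        } catch {
            print("lockForConfiguration failed: \(error)")
        }
        if pinch.state == .ended {
            lastZoomFactor = newFactor // commit the zoom for the next gesture
        }
    }

Because videoZoomFactor is applied by the capture pipeline itself, photos taken afterwards reflect the zoom, unlike a transform applied only to the preview layer.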

AVCapture appendSampleBuffer

和自甴很熟 submitted on 2019-12-02 18:37:21
I am going insane with this one - I have looked everywhere and tried anything and everything I can think of. I am making an iPhone app that uses AVFoundation - specifically AVCapture - to capture video using the iPhone camera. I need a custom image overlaid on the video feed to be included in the recording. So far I have the AVCapture session set up, can display the feed, access the frames, save a frame as a UIImage, and merge the overlay image onto it. Then I convert this new UIImage into a CVPixelBufferRef. And to double-check that the bufferRef is working, I converted it back to a UIImage and …
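
A minimal sketch of the UIImage-to-CVPixelBuffer step; the function name is an assumption, and the 32ARGB format plus the compatibility attributes are the usual choices for buffers fed to an AVAssetWriterInputPixelBufferAdaptor:

    import UIKit
    import CoreVideo

    func pixelBuffer(from image: UIImage) -> CVPixelBuffer? {
        let size = image.size
        let attrs: [CFString: Any] = [
            kCVPixelBufferCGImageCompatibilityKey: true,
            kCVPixelBufferCGBitmapContextCompatibilityKey: true
        ]
        var buffer: CVPixelBuffer?
        guard CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                                  kCVPixelFormatType_32ARGB, attrs as CFDictionary,
                                  &buffer) == kCVReturnSuccess,
              let pixelBuffer = buffer else { return nil }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

        // Draw the merged frame directly into the buffer's backing memory.
        guard let cgImage = image.cgImage,
              let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                      width: Int(size.width), height: Int(size.height),
                                      bitsPerComponent: 8,
                                      bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        else { return nil }

        context.draw(cgImage, in: CGRect(origin: .zero, size: size))
        return pixelBuffer
    }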

Cropping a captured image exactly to how it looks in AVCaptureVideoPreviewLayer

感情迁移 submitted on 2019-12-02 18:25:56
I have a photo app that uses AV Foundation. I have set up a preview layer using AVCaptureVideoPreviewLayer that takes up the top half of the screen, so when the user is trying to take their photo, all they can see is what the top half of the screen sees. This works great, but when the user actually takes the photo and I try to set the photo as the layer's contents, the image is distorted. I did some research and realized that I need to crop the image. All I want to do is crop the full captured image so that all that is left is exactly what the user could originally see in the top half of …
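
One way to get exactly that crop, sketched under the assumptions that the preview layer uses .resizeAspectFill and the deployment target is iOS 11+ (for metadataOutputRectConverted(fromLayerRect:)):

    import AVFoundation
    import UIKit

    func cropToPreview(_ image: UIImage,
                       previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
        // Normalized (0...1) rect of the visible preview, expressed in the
        // capture output's (unrotated sensor) coordinate space, which matches
        // how the underlying CGImage is stored.
        let outputRect = previewLayer.metadataOutputRectConverted(fromLayerRect: previewLayer.bounds)
        guard let cgImage = image.cgImage else { return nil }

        let width = CGFloat(cgImage.width)
        let height = CGFloat(cgImage.height)
        let cropRect = CGRect(x: outputRect.minX * width,
                              y: outputRect.minY * height,
                              width: outputRect.width * width,
                              height: outputRect.height * height)

        guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
        return UIImage(cgImage: cropped, scale: image.scale,
                       orientation: image.imageOrientation)
    }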

Can I use AVCaptureSession to encode an AAC stream to memory?

只谈情不闲聊 submitted on 2019-12-02 17:21:31
I'm writing an iOS app that streams video and audio over the network. I am using AVCaptureSession to grab raw video frames via AVCaptureVideoDataOutput and encode them in software using x264. This works great. I wanted to do the same for audio, except that I don't need as much control on the audio side, so I wanted to use the built-in hardware encoder to produce an AAC stream. This meant using an Audio Converter from the Audio Toolbox layer. In order to do so, I put in a handler for AVCaptureAudioDataOutput's audio frames: - (void)captureOutput:(AVCaptureOutput *)captureOutput …
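
A sketch of the capture side of that setup, assuming these methods live in a class that adopts AVCaptureAudioDataOutputSampleBufferDelegate and that `session` is the running AVCaptureSession (the AAC conversion itself would happen inside the delegate callback):

    import AVFoundation

    func attachAudioOutput(to session: AVCaptureSession) {
        let audioOutput = AVCaptureAudioDataOutput()
        audioOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "audio.capture"))
        if session.canAddOutput(audioOutput) {
            session.addOutput(audioOutput)
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each CMSampleBuffer carries raw LPCM audio; this is where the buffer
        // would be handed to the Audio Toolbox converter for AAC encoding.
    }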

How to crop an image from AVCapture to a rect seen on the display

前提是你 submitted on 2019-12-02 17:16:37
This is driving me crazy because I can't get it to work. I have the following scenario: I'm using an AVCaptureSession and an AVCaptureVideoPreviewLayer to create my own camera interface. The interface shows a rectangle; beneath it is the AVCaptureVideoPreviewLayer, which fills the whole screen. I want the captured image to be cropped so that the resulting image shows exactly the content seen in the rect on the display. My setup looks like this:

    _session = [[AVCaptureSession alloc] init];
    AVCaptureSession *session = _session;
    session.sessionPreset = AVCaptureSessionPresetPhoto; …
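
The same coordinate conversion as in the previous question applies here; a short sketch, assuming `rectOnScreen` is the overlay rectangle in the preview layer's coordinate space and `cgImage` is the captured photo:

    // Map the on-screen rect to normalized capture-output coordinates,
    // then scale up to pixels and crop.
    let outputRect = previewLayer.metadataOutputRectConverted(fromLayerRect: rectOnScreen)
    let w = CGFloat(cgImage.width), h = CGFloat(cgImage.height)
    let cropRect = CGRect(x: outputRect.minX * w, y: outputRect.minY * h,
                          width: outputRect.width * w, height: outputRect.height * h)
    let croppedImage = cgImage.cropping(to: cropRect)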

Getting only a white screenshot

北城以北 submitted on 2019-12-02 11:39:10
Question: I can read the barcode, but I can't get a snapshot of the screen. My getScreenImage function returns a white screen. How can I get a screenshot that includes the camera view? Thank you.

    @interface igViewController () <AVCaptureMetadataOutputObjectsDelegate, AVCaptureVideoDataOutputSampleBufferDelegate> {
        AVCaptureSession *_session;
        AVCaptureDevice *_device;
        AVCaptureDeviceInput *_input;
        AVCaptureMetadataOutput *_output;
        AVCaptureVideoPreviewLayer *_prevLayer;
        UIView * …
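
The usual explanation: AVCaptureVideoPreviewLayer is rendered by the GPU, so renderInContext:-style snapshots of it come out blank. A common workaround is to keep the most recent frame from the AVCaptureVideoDataOutput delegate and convert it to a UIImage when a screenshot is requested; a sketch:

    import AVFoundation
    import UIKit

    func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }

The resulting UIImage can then be composited under the rest of the UI in a normal UIGraphicsImageRenderer pass.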

Can't use AVCaptureDevice with a flash

二次信任 submitted on 2019-12-02 05:43:52
Question: I am having a difficult time with something I think ought to be simple: I just want to fire the flash when taking a picture in my iOS app, and everything I have tried either fails outright or works only 20 percent of the time. Here is the code that lights the flash:

    // Here we have: captureDevice.hasFlash && captureDevice.isFlashModeSupported(.On)
    do {
        try captureDevice.lockForConfiguration()
        captureDevice.flashMode = .On
        captureDevice.unlockForConfiguration()
    } catch let error as NSError {
        print("captureDevice …
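
On iOS 10 and later, the reliable fix is to stop setting flashMode on the device (that setter is deprecated there) and instead request the flash per shot through AVCapturePhotoSettings. A sketch, assuming `photoOutput` is a configured AVCapturePhotoOutput and `self` is the capture delegate:

    import AVFoundation

    let settings = AVCapturePhotoSettings()
    if photoOutput.supportedFlashModes.contains(.on) {
        settings.flashMode = .on // the flash fires for this capture only
    }
    photoOutput.capturePhoto(with: settings, delegate: self)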