AVCaptureSession

Performance issues when using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

旧时模样 submitted on 2020-01-10 09:14:04
Question: I'm having lag issues when recording audio+video using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. Sometimes the video blocks for a few milliseconds, and sometimes the audio is not in sync with the video. I inserted some logs and observed that I first get a lot of video buffers in the captureOutput callback, and only after some time do I get the audio buffers (sometimes I don't receive the audio buffers at all, and the resulting video has no sound). If I comment out the code that handles the…
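
A minimal sketch of the mitigation usually suggested for this kind of lag: give each data output its own serial dispatch queue, let late video frames be dropped, and keep the delegate callback cheap. The class and queue names below are illustrative, not taken from the question.

import AVFoundation

final class CaptureCoordinator: NSObject,
    AVCaptureVideoDataOutputSampleBufferDelegate,
    AVCaptureAudioDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    // Separate serial queues so a slow video callback cannot starve audio delivery.
    private let videoQueue = DispatchQueue(label: "capture.video")
    private let audioQueue = DispatchQueue(label: "capture.audio")

    func configureOutputs() {
        let videoOutput = AVCaptureVideoDataOutput()
        // Drop video frames instead of queueing them when processing falls behind.
        videoOutput.alwaysDiscardsLateVideoFrames = true
        videoOutput.setSampleBufferDelegate(self, queue: videoQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

        let audioOutput = AVCaptureAudioDataOutput()
        audioOutput.setSampleBufferDelegate(self, queue: audioQueue)
        if session.canAddOutput(audioOutput) { session.addOutput(audioOutput) }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Keep this as cheap as possible; hand heavy work (encoding, writing
        // with AVAssetWriter) off to another queue so buffers are not delayed.
    }
}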

AVCaptureSession vs. UIImagePickerController speeds

拈花ヽ惹草 submitted on 2020-01-06 15:09:59
Question: At the moment I am using a custom overlay on a UIImagePickerController, calling takePicture() to capture images. However, it takes a good few seconds for the delegate method didFinishPickingMediaWithInfo to be called. I've heard about using AVCaptureSession for more control over the camera. Would this give me faster picture-taking speeds (similar to Snapchat's)? Or are there any other ways that I can… Thanks. EDIT: I'm implementing my image capture as follows. First I initialise a…
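
For comparison, a rough sketch (not the asker's code) of a pipeline built on AVCaptureSession with the modern AVCapturePhotoOutput; because the session keeps running, taking a photo is close to instant compared with the UIImagePickerController round-trip:

import AVFoundation
import UIKit

final class FastCamera: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()
        session.startRunning()   // keep the session running; capture is then near-instant
    }

    func snap() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        // Use `image` immediately; no picker delegate round-trip involved.
        _ = image
    }
}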

How do I implement camera changing from front to back camera

ε祈祈猫儿з submitted on 2020-01-06 06:37:33
Question: Below is the code for the camera section. I tried adding a boolean to detect when the front camera is activated, but I receive an error. import UIKit import AVFoundation class MainCameraCollectionViewCell: UICollectionViewCell { @IBOutlet weak var myView: UIView! var captureSession = AVCaptureSession() var backCamera: AVCaptureDevice? var frontCamera: AVCaptureDevice? var currentCamera: AVCaptureDevice? var photoOutPut: AVCapturePhotoOutput? var cameraPreviewLayer: …
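
One common way to switch between front and back cameras, sketched here with hypothetical naming rather than the asker's class: swap the session's device input inside a beginConfiguration/commitConfiguration block, after which the active position can be read straight from the input instead of a separate boolean:

import AVFoundation

func switchCamera(on session: AVCaptureSession) {
    guard let currentInput = session.inputs.first as? AVCaptureDeviceInput else { return }
    let newPosition: AVCaptureDevice.Position =
        currentInput.device.position == .back ? .front : .back
    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: newPosition),
          let newInput = try? AVCaptureDeviceInput(device: newDevice) else { return }

    session.beginConfiguration()
    session.removeInput(currentInput)           // the old input must be removed first
    if session.canAddInput(newInput) {
        session.addInput(newInput)
    } else {
        session.addInput(currentInput)          // roll back if the new input is rejected
    }
    session.commitConfiguration()
}

// The front camera is now detectable without a separate flag:
// (session.inputs.first as? AVCaptureDeviceInput)?.device.position == .front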

AVCam save full screen captured image

回眸只為那壹抹淺笑 submitted on 2020-01-04 13:10:14
Question: I am using Apple's AVCam sample for my custom camera view. Honestly, it is not so simple to understand what's going on in the class AVCamViewController when you see it for the first time. Right now I am interested in how they set the frame of the captured image. I tried to find frame setters or something similar, but have not found any. I searched Google and found an answer here: AVCam not in fullscreen. But when I implemented that solution I realised that it just made the live camera preview…
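
A sketch of one common approach (not taken from the AVCam sample itself, and with orientation handling simplified): make the preview fill the screen with videoGravity = .resizeAspectFill, then crop the captured image to the rect the preview actually displayed.

import AVFoundation
import UIKit

// Full-screen preview during setup (illustrative):
// previewLayer.videoGravity = .resizeAspectFill
// previewLayer.frame = view.bounds

// Crop a captured UIImage to what the preview layer actually showed.
func cropToPreview(_ image: UIImage, previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Visible preview rect expressed in normalized (0...1) image coordinates.
    let outputRect = previewLayer.metadataOutputRectConverted(fromLayerRect: previewLayer.bounds)
    guard let cgImage = image.cgImage else { return nil }
    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let cropRect = CGRect(x: outputRect.origin.x * width,
                          y: outputRect.origin.y * height,
                          width: outputRect.size.width * width,
                          height: outputRect.size.height * height)
    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}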

Pictures taken with the camera come out really dark on iOS w/ Swift

僤鯓⒐⒋嵵緔 submitted on 2020-01-02 02:17:08
Question: The camera preview looks perfect, but when I take a picture and save it to the Camera Roll the picture comes out extremely dark. I've tried a bunch of threads on here but nothing solved it. Here's my code. This sets up the camera/preview and works fine: captureSession.sessionPreset = AVCaptureSessionPreset640x480 let devices = AVCaptureDevice.devices() // Loop through all the capture devices on this phone for device in devices { // Make sure this particular device supports video if (device…
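
One frequent cause of this symptom, offered as a hedged guess rather than the confirmed fix for this question: the still image is captured before auto exposure has finished settling. A small sketch that delays the capture until AVCaptureDevice stops adjusting exposure (names here are illustrative):

import AVFoundation

final class ExposureGate {
    private var observation: NSKeyValueObservation?

    // Run `capture` only once the device has settled its automatic exposure.
    func captureWhenExposed(device: AVCaptureDevice, capture: @escaping () -> Void) {
        guard device.isAdjustingExposure else { capture(); return }
        observation = device.observe(\.isAdjustingExposure, options: [.new]) { [weak self] _, change in
            if change.newValue == false {
                self?.observation = nil
                DispatchQueue.main.async { capture() }
            }
        }
    }
}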

Most efficient/realtime way to get pixel values from iOS camera feed in Swift

江枫思渺然 submitted on 2020-01-01 19:22:21
Question: There are some discussions on here about similar questions, like this one, but they seem quite outdated, so I thought I'd ask here. I want to get near-realtime RGB pixel values, or even better, a full-image RGB histogram from a camera feed in Swift 2.0. I want this to be as quick and up to date as possible (~30 fps or higher, ideally). Can I get this directly from an AVCaptureVideoPreviewLayer, or do I need to capture each frame (async, I assume, if the process takes significant time) and then extract…
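
A preview layer will not hand back pixels, so the usual route is an AVCaptureVideoDataOutput delivering BGRA frames whose bytes can be read directly. A sketch along those lines (class and queue names are made up), with a crude per-frame red-channel histogram as the example:

import AVFoundation
import CoreVideo

final class PixelReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func makeOutput() -> AVCaptureVideoDataOutput {
        let output = AVCaptureVideoDataOutput()
        // Ask for BGRA so each pixel is 4 contiguous bytes (B, G, R, A).
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "pixels"))
        return output
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
        guard let base = CVPixelBufferGetBaseAddress(buffer) else { return }
        let pixels = base.assumingMemoryBound(to: UInt8.self)

        // Example: a crude red-channel histogram for this frame.
        var histogram = [Int](repeating: 0, count: 256)
        for y in 0..<height {
            for x in 0..<width {
                let offset = y * bytesPerRow + x * 4
                histogram[Int(pixels[offset + 2])] += 1   // BGRA layout: red is byte 2
            }
        }
        _ = histogram
    }
}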

Capture current camera image using AVFoundation

大城市里の小女人 submitted on 2020-01-01 05:47:08
Question: I'm trying to capture an image and save it to a variable when I press "myButton". What should I do? My code is as follows: import UIKit import AVFoundation import MobileCoreServices class ViewController: UIViewController { let captureSession = AVCaptureSession() var previewLayer: AVCaptureVideoPreviewLayer? var captureDevice: AVCaptureDevice? @IBOutlet var myTap: UITapGestureRecognizer! @IBOutlet weak var myButton: UIButton! @IBAction func shotPress(sender: UIButton) { //Save image to…
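
A sketch of one way to do this with the modern AVCapturePhotoOutput rather than the older API the question's code appears to use; the output must also be added to captureSession during session setup, which is omitted here:

import UIKit
import AVFoundation

final class CaptureViewController: UIViewController, AVCapturePhotoCaptureDelegate {
    let captureSession = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()     // add to captureSession during setup
    var capturedImage: UIImage?                  // the variable that receives the shot

    @IBAction func shotPress(_ sender: UIButton) {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation() else { return }
        capturedImage = UIImage(data: data)      // image is now stored on the controller
    }
}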