AVCaptureSession

Method to find a device's camera resolution on iOS

落花浮王杯 submitted on 2019-11-26 19:03:30
Question: What's the best method to find the image resolution that will be captured when using the AVCaptureSessionPresetPhoto preset? I am trying to find the resolution before capturing the image. Answer 1: With the function below, you can programmatically get the resolution from activeFormat before capture begins, though not before adding inputs and outputs: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureDevice_Class/index.html#//apple_ref/occ/instp/AVCaptureDevice
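A minimal Swift sketch of that lookup (assuming a device whose session inputs have already been added; `activeResolution(for:)` is an illustrative name, not an API):

```swift
import AVFoundation

// Sketch: read the current capture resolution from the device's activeFormat.
// For the photo preset, activeFormat.highResolutionStillImageDimensions gives
// the still-image size; formatDescription gives the video dimensions.
func activeResolution(for device: AVCaptureDevice) -> CGSize {
    let description = device.activeFormat.formatDescription
    let dimensions = CMVideoFormatDescriptionGetDimensions(description)
    return CGSize(width: Int(dimensions.width), height: Int(dimensions.height))
}
```

Note that activeFormat only reflects the preset after the session is configured, which is why the resolution cannot be read before inputs and outputs are attached.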

Barcode on Swift 4

和自甴很熟 submitted on 2019-11-26 18:51:45
I'm trying to upgrade my app to Swift 4, but the barcode reader is not working. I have isolated the barcode reader code, and it still doesn't work. The camera works, but it does not detect the barcode. The code worked just fine on Swift 3 / iOS 10. This is the complete code: import AVFoundation import UIKit class ViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate { var captureSession: AVCaptureSession! var previewLayer: AVCaptureVideoPreviewLayer! override func viewDidLoad() { super.viewDidLoad() view.backgroundColor = UIColor.black captureSession = AVCaptureSession() let
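A common cause of exactly this symptom when migrating to Swift 4 is that the metadata delegate callback was renamed; if the old Swift 3 signature is kept, the method compiles but is never called, so the camera works while no barcode is ever detected. A sketch of the Swift 4 signature (`ScannerDelegate` is an illustrative class name):

```swift
import AVFoundation

// Swift 4 renamed the AVCaptureMetadataOutputObjectsDelegate callback.
// The Swift 3 captureOutput(_:didOutputMetadataObjects:from:) variant
// is silently ignored under Swift 4.
class ScannerDelegate: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject else {
            return
        }
        print("Scanned value: \(code.stringValue ?? "")")
    }
}
```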

AVCaptureSession audio doesn't work for long videos

情到浓时终转凉″ submitted on 2019-11-26 17:47:12
Question: I'm using AVCaptureSession to record a video with audio. Everything seems to work properly for short videos, but for some reason, if I record a video that is longer than about 12 seconds, the audio doesn't work. Answer 1: We also experienced this issue. Disabling movie fragment writing works around it, but that alone doesn't explain the issue. Most likely you are recording to an output file using a file extension that does not support this feature, like mp4. If you pass an output file with
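The workaround mentioned in the answer can be sketched like this (assuming an AVCaptureMovieFileOutput; movie fragments are written every 10 seconds by default, which lines up with audio breaking at around the 12-second mark):

```swift
import AVFoundation

// Sketch: QuickTime movie fragments are written every 10 s by default.
// MP4 does not support fragmented writing, so audio past the first
// fragment can be lost. Either record to a .mov file, or disable
// fragment writing entirely:
let movieOutput = AVCaptureMovieFileOutput()
movieOutput.movieFragmentInterval = .invalid  // disable fragment writing
```

The trade-off is that with fragments disabled, a recording interrupted mid-write (crash, battery death) is more likely to be unplayable.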

AVCaptureVideoPreviewLayer orientation - need landscape

て烟熏妆下的殇ゞ submitted on 2019-11-26 15:25:39
Question: My app is landscape only. I'm presenting the AVCaptureVideoPreviewLayer like this: self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; [self.previewLayer setBackgroundColor:[[UIColor blackColor] CGColor]]; [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect]; NSLog(@"previewView: %@", self.previewView); CALayer *rootLayer = [self.previewView layer]; [rootLayer setMasksToBounds:YES]; [self.previewLayer setFrame:[rootLayer bounds]]; NSLog(@
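The usual fix is to set the orientation on the preview layer's connection after the layer is created. A Swift sketch (the question's code is Objective-C, but it sets the same properties; `configureLandscape` is an illustrative name):

```swift
import AVFoundation

// Sketch: pin the preview layer's connection to landscape so the
// preview matches a landscape-only UI regardless of device rotation.
func configureLandscape(_ previewLayer: AVCaptureVideoPreviewLayer) {
    if let connection = previewLayer.connection,
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .landscapeRight
    }
}
```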

AVCaptureSession and background audio iOS 7

廉价感情. submitted on 2019-11-26 09:37:43
Question: Whenever I start an AVCaptureSession running with the microphone as an input, it cancels whatever background music is currently playing (iPod music, for instance). If I comment out the line adding the audio input, the background audio continues. Does anyone know a way to record video clips with the microphone while continuing to allow background audio to play? Also, there is an error when you try to record video while music is currently playing. I tried to do it like this: [[AVAudioSession
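One approach is to stop the capture session from managing the app's audio session and configure AVAudioSession to mix with other audio. A hedged Swift sketch (the question is iOS 7-era Objective-C; this uses the modern Swift spellings of the same APIs, and `configureMixedAudio` is an illustrative name):

```swift
import AVFoundation

// Sketch: let background (iPod) audio keep playing while capturing.
// The capture session must stop reconfiguring the audio session itself,
// and the audio session category must opt in to mixing.
func configureMixedAudio(for session: AVCaptureSession) throws {
    session.automaticallyConfiguresApplicationAudioSession = false
    try AVAudioSession.sharedInstance().setCategory(
        .playAndRecord,
        options: [.mixWithOthers, .defaultToSpeaker])
    try AVAudioSession.sharedInstance().setActive(true)
}
```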

Switch cameras with AVCaptureSession

我是研究僧i submitted on 2019-11-26 09:29:41
Question: Using this tutorial: http://www.musicalgeometry.com/?p=1297 I have created a custom overlay and image capture with AVCaptureSession. I am attempting to allow the user to switch between the front and back camera. Here is my code in CaptureSessionManager to switch cameras: - (void)addVideoInputFrontCamera:(BOOL)front { NSArray *devices = [AVCaptureDevice devices]; AVCaptureDevice *frontCamera; AVCaptureDevice *backCamera; for (AVCaptureDevice *device in devices) { //NSLog(@"Device name:
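The camera swap itself must happen inside a begin/commitConfiguration pair: remove the current video input, then add the one for the desired position. A Swift sketch (the question's code is Objective-C; `switchCamera` is an illustrative name, and the deprecated `[AVCaptureDevice devices]` loop is replaced with the position-based lookup):

```swift
import AVFoundation

// Sketch: swap the video input atomically inside begin/commitConfiguration.
func switchCamera(on session: AVCaptureSession, toFront front: Bool) {
    session.beginConfiguration()
    // Remove the existing video input(s), leaving audio inputs alone.
    for input in session.inputs {
        if let deviceInput = input as? AVCaptureDeviceInput,
           deviceInput.device.hasMediaType(.video) {
            session.removeInput(deviceInput)
        }
    }
    // Add the camera for the requested position.
    let position: AVCaptureDevice.Position = front ? .front : .back
    if let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video,
                                            position: position),
       let input = try? AVCaptureDeviceInput(device: device),
       session.canAddInput(input) {
        session.addInput(input)
    }
    session.commitConfiguration()
}
```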

Trying to understand CMTime

我只是一个虾纸丫 submitted on 2019-11-26 09:21:06
Question: I have seen some examples of CMTime (three separate links), but I still don't get it. I'm using an AVCaptureSession with AVCaptureVideoDataOutput and I want to set the max and min frame rate of the output. My problem is I just don't understand the CMTime struct. Apparently CMTimeMake(value, timeScale) should give me value frames every 1/timeScale seconds for a total of value/timeScale seconds, or am I getting that wrong? Why isn't this documented anywhere in order to explain what this
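The key point is that CMTime does not describe frames at all: it is a rational timestamp, value/timescale seconds. Frame rates only come in because a frame *duration* is a time. A sketch:

```swift
import CoreMedia

// A CMTime is a rational number of seconds: value / timescale.
// CMTimeMake(value: 1, timescale: 30) is one thirtieth of a second,
// i.e. the duration of one frame at 30 fps -- not "1 frame per 30 s".
let frameDuration = CMTimeMake(value: 1, timescale: 30)
let seconds = CMTimeGetSeconds(frameDuration)  // 1/30 ≈ 0.0333 s

// To cap capture at 30 fps, you set the minimum time allowed between
// frames on the device (sketch; lockForConfiguration omitted):
// device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 30)
```

So min/max frame *rate* maps to max/min frame *duration*: a higher frame rate means a smaller CMTime between frames.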

Avoiding blurriness at start & end of video (even after using setPreferredVideoStabilizationMode:AVCaptureVideoStabilizationModeAuto)?

旧城冷巷雨未停 submitted on 2019-11-26 08:37:17
Question: We capture video on iOS while using setPreferredVideoStabilizationMode:AVCaptureVideoStabilizationModeAuto, but the video still sometimes comes out blurry at the start and at the end (fine in the middle, though), which is very problematic because we grab the first frame as a still image (in order to enable video & photo capabilities without switching camera modes). Placing the device flat on a desk removes all blurriness, so the whole video is sharp throughout. This suggests it has something
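One plausible explanation (an assumption, not a confirmed fix from the source) is that the blur is autofocus/auto-exposure settling rather than a stabilization failure: the desk test removes both hand shake and focus hunting. In that case, waiting for focus to settle and then locking it before grabbing the first frame may help:

```swift
import AVFoundation

// Sketch (assumption): lock focus once it has settled, before the
// first frame is captured, so autofocus hunting cannot blur it.
// Observing isAdjustingFocus via KVO to detect settling is omitted.
func lockFocus(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    if device.isFocusModeSupported(.locked) {
        device.focusMode = .locked
    }
    device.unlockForConfiguration()
}
```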

How to apply “filters” to AVCaptureVideoPreviewLayer

怎甘沉沦 submitted on 2019-11-26 05:16:59
Question: My app is currently using AVFoundation to take the raw camera data from the rear camera of an iPhone and display it on an AVCaptureVideoPreviewLayer in real time. My goal is to conditionally apply simple image filters to the preview layer. The images aren't saved, so I do not need to capture the output. For example, I would like to toggle a setting that converts the video coming in on the preview layer to black & white. I found a question here that seems to accomplish something similar by
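AVCaptureVideoPreviewLayer itself cannot be filtered, so the usual approach is to replace it: receive raw frames through an AVCaptureVideoDataOutput delegate, run each frame through a CIFilter, and draw the result yourself. A sketch (`FilteredPreview` is an illustrative name; rendering into a view is omitted):

```swift
import AVFoundation
import CoreImage

// Sketch: filter live frames with Core Image instead of trying to
// filter the (unfilterable) preview layer directly.
class FilteredPreview: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let mono = CIFilter(name: "CIPhotoEffectMono")
    var filteringEnabled = true  // the black & white toggle

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        var image = CIImage(cvPixelBuffer: pixelBuffer)
        if filteringEnabled, let mono = mono {
            mono.setValue(image, forKey: kCIInputImageKey)
            image = mono.outputImage ?? image
        }
        // Render `image` into an MTKView / GLKView here.
        _ = image
    }
}
```

Toggling the filter then becomes a matter of flipping `filteringEnabled`, with no session reconfiguration needed.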