AVCaptureSession

Can't use AVCaptureDevice with a flash

早过忘川 submitted on 2019-12-01 22:53:12
I am having a difficult time with something that I think ought to be simple. I just want to fire the flash when taking a picture in my iOS app, and everything I have tried either fails or works only 20 percent of the time. Here is the code that lights the flash up:

    // Here we have: captureDevice.hasFlash && captureDevice.isFlashModeSupported(.On)
    do {
        try captureDevice.lockForConfiguration()
        captureDevice.flashMode = .On
        captureDevice.unlockForConfiguration()
    } catch let error as NSError {
        print("captureDevice.lockForConfiguration FAILED")
        print(error.code)
    }

I have tried several flavors of the code, by moving the 2…
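A minimal sketch of an alternative, assuming the modern capture pipeline is available: on iOS 10+ the AVCaptureDevice.flashMode property is deprecated, and the flash is instead requested per shot through AVCapturePhotoSettings. The `photoOutput` and `delegate` parameters below are placeholders for an already-configured AVCapturePhotoOutput and its capture delegate.

```swift
import AVFoundation

// Sketch: request the flash per shot via AVCapturePhotoSettings (iOS 10+).
func capturePhotoWithFlash(using photoOutput: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    // Only ask for flash if the output reports it as supported.
    if photoOutput.supportedFlashModes.contains(.on) {
        settings.flashMode = .on
    }
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```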

How do I record a video on iOS without using a preset?

一个人想着一个人 submitted on 2019-12-01 21:39:55
The simplest way to record a video on iOS is to set an AVCaptureSession.sessionPreset, but that does not work for me because I want to control parameters like binning, stabilization (cinematic, standard, or none) and ISO. I find the format I want and assign it to activeFormat, but when I try to start recording I get an error:

    Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureMovieFileOutput startRecordingToOutputFileURL:recordingDelegate:] No active/enabled connections'

Here is my initialisation code:

    let device = AVCaptureDevice.defaultDevice(…
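A rough sketch under one assumption: the "No active/enabled connections" exception usually means the movie file output never got a connection to a video input, so the inputs and outputs are wired up first and the device format is overridden afterwards. Setting activeFormat inside lockForConfiguration switches the session to input-priority mode, so no preset is needed. `chooseFormat` is a placeholder for the questioner's own format-selection logic.

```swift
import AVFoundation

// Sketch: record without a preset by selecting a device format directly.
func makeRecordingSession(delegate: AVCaptureFileOutputRecordingDelegate,
                          outputURL: URL) throws -> AVCaptureSession? {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .back),
          let format = chooseFormat(from: device.formats) else { return nil }

    let session = AVCaptureSession()
    session.beginConfiguration()

    // 1. Wire up input and output first, so the movie output gets a connection.
    let input = try AVCaptureDeviceInput(device: device)
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let movieOutput = AVCaptureMovieFileOutput()
    guard session.canAddOutput(movieOutput) else { return nil }
    session.addOutput(movieOutput)
    session.commitConfiguration()

    // 2. Then override the format; this replaces the preset with
    //    "input priority" behaviour.
    try device.lockForConfiguration()
    device.activeFormat = format
    device.unlockForConfiguration()

    session.startRunning()
    movieOutput.startRecording(to: outputURL, recordingDelegate: delegate)
    return session
}

// Placeholder: real code would filter on dimensions, frame rate,
// stabilization support, etc.
func chooseFormat(from formats: [AVCaptureDevice.Format]) -> AVCaptureDevice.Format? {
    return formats.first
}
```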

AVCapture image orientation

China☆狼群 submitted on 2019-12-01 20:53:34
I have a view controller which allows a user to take a picture. I am setting the AVCapture bounds to be the bounds of a view on screen. Above this view I have a collection view, so users can capture multiple pictures, which are then added to the collection view above. I am having trouble getting the correct orientation to appear in my preview above. The code is as follows:

    @IBOutlet weak var imagePreviews: UICollectionView!
    @IBOutlet weak var imgPreview: UIView!
    var session: AVCaptureSession?
    var stillImageOutput: AVCaptureStillImageOutput?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var images…
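A hedged sketch of one common fix: set the video orientation on the still-image output's connection to match the device orientation before capturing, so the produced JPEG is already rotated correctly. AVCaptureStillImageOutput is used only because that is what the question declares (it is deprecated in favor of AVCapturePhotoOutput).

```swift
import AVFoundation
import UIKit

// Sketch: align the capture connection's orientation with the device
// orientation before grabbing the still image.
func captureStill(from stillImageOutput: AVCaptureStillImageOutput,
                  completion: @escaping (UIImage?) -> Void) {
    guard let connection = stillImageOutput.connection(with: .video) else {
        completion(nil)
        return
    }

    // Map the current device orientation to a video orientation
    // (note: landscape left/right are intentionally swapped).
    switch UIDevice.current.orientation {
    case .landscapeLeft:       connection.videoOrientation = .landscapeRight
    case .landscapeRight:      connection.videoOrientation = .landscapeLeft
    case .portraitUpsideDown:  connection.videoOrientation = .portraitUpsideDown
    default:                   connection.videoOrientation = .portrait
    }

    stillImageOutput.captureStillImageAsynchronously(from: connection) { buffer, _ in
        guard let buffer = buffer,
              let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer) else {
            completion(nil)
            return
        }
        completion(UIImage(data: data))
    }
}
```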

Support for background recording of video using AVCaptureSession

谁都会走 submitted on 2019-12-01 18:39:40
I am trying to record video in the background as well, but currently my code only records video in the foreground. When the app goes to the background, the delegate method

    -(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error

fires immediately with the error:

    Error Domain=AVFoundationErrorDomain Code=-11818 "Recording Stopped" UserInfo=0x176aa180 {NSLocalizedRecoverySuggestion=Stop any other actions using the recording device and try again., NSUnderlyingError=0x1766c0e0 "The operation couldn’t…
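The system interrupts the camera capture session when a third-party app leaves the foreground, which is why the recording stops with this error. A sketch, assuming the goal is reduced to detecting the interruption and reacting cleanly rather than keeping the camera running in the background; `session` is a placeholder for the app's configured AVCaptureSession.

```swift
import AVFoundation

// Sketch: observe session interruptions (e.g. the app being backgrounded)
// so recording can be stopped cleanly and restarted when capture resumes.
final class CaptureInterruptionObserver {
    private var tokens: [NSObjectProtocol] = []

    init(session: AVCaptureSession) {
        let center = NotificationCenter.default
        tokens.append(center.addObserver(forName: .AVCaptureSessionWasInterrupted,
                                         object: session, queue: .main) { note in
            // The userInfo carries an interruption reason
            // (e.g. video device not available in background).
            print("Capture interrupted:", note.userInfo ?? [:])
        })
        tokens.append(center.addObserver(forName: .AVCaptureSessionInterruptionEnded,
                                         object: session, queue: .main) { _ in
            print("Interruption ended; safe to restart recording.")
        })
    }

    deinit { tokens.forEach { NotificationCenter.default.removeObserver($0) } }
}
```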

Take ownership of memory from CVImageBufferRef

久未见 submitted on 2019-12-01 17:44:58
I am making a simple pipeline that gets images from AVCaptureSession, processes them in OpenCV, and then renders them in OpenGL. It is based on RosyWriter but without the audio and recording capabilities. The OpenCV processing looks like:

    - (void)processPixelBuffer: (CVImageBufferRef)pixelBuffer {
        CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
        int bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
        int bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
        unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
        cv::Mat image = cv::Mat(bufferWidth,bufferHeight,CV_8UC4,pixel…
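A cv::Mat built from the base address only wraps memory still owned by the CVPixelBuffer, so one way to take ownership is to copy the bytes out while the base address is locked. The question's code is Objective-C++/OpenCV; the sketch below shows the same copy idea in Swift, assuming a non-planar (e.g. BGRA) buffer.

```swift
import AVFoundation
import CoreVideo

// Sketch: copy a pixel buffer's bytes into memory we own, so the buffer can
// be unlocked and recycled by the capture pipeline while processing continues.
func copyPixels(from pixelBuffer: CVPixelBuffer)
        -> (data: Data, width: Int, height: Int, bytesPerRow: Int)? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    // Data(bytes:count:) copies, so the returned bytes are independent of
    // the CVPixelBuffer's backing store.
    let data = Data(bytes: base, count: bytesPerRow * height)
    return (data, width, height, bytesPerRow)
}
```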

Is there any way to get frame by frame using AVCaptureSession object in swift?

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-01 07:24:53
Question: I have to process frames captured by the iPhone camera using my C++ functions, so I use the startRunning() function to start the flow of data. But how can I process each frame?

Answer 1: Yes, it is pretty straightforward. You need to (see the sketch below):
1. Create an AVCaptureVideoDataOutput object to produce video frames.
2. Implement a delegate for the AVCaptureVideoDataOutput object to process video frames.
3. In the delegate class, implement the method (captureOutput:didOutputSampleBuffer:fromConnection:) that is…
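A minimal Swift sketch of the three steps above; the class name and queue label are placeholders.

```swift
import AVFoundation

// Sketch: deliver camera frames to a delegate as CVPixelBuffers.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")   // placeholder label

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        guard session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()                  // step 1
        output.setSampleBufferDelegate(self, queue: queue)       // step 2
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    // Step 3: called once per frame; hand the pixel buffer to C++ from here.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        _ = pixelBuffer // e.g. pass across a bridging header to the C++ code
    }
}
```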

AVCaptureSession addInput causing glitch in background audio

╄→尐↘猪︶ㄣ submitted on 2019-12-01 05:32:41
I'm making a video-capturing iOS app and I want to be able to record audio from the microphone while allowing background music to play. I can do all of this, but the background audio skips (pauses briefly) whenever the view with the camera enters and exits the foreground. I have isolated the bug to AVCaptureSession addInput:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.automaticallyConfiguresApplicationAudioSession = NO;
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioDeviceInput = …
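A hedged sketch: with automaticallyConfiguresApplicationAudioSession turned off, the app has to put the shared AVAudioSession into a category that mixes with other audio before the microphone input is added; otherwise activating the capture audio path can interrupt the background track. The category, mode, and option choices below are assumptions, not the questioner's configuration.

```swift
import AVFoundation

// Sketch: keep background music playing while recording from the mic.
func configureMixedAudioCapture(for session: AVCaptureSession) throws {
    // 1. Configure the app's audio session to mix with other audio.
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord,
                                 mode: .videoRecording,
                                 options: [.mixWithOthers, .defaultToSpeaker])
    try audioSession.setActive(true)

    // 2. Keep AVCaptureSession from overwriting that configuration.
    session.automaticallyConfiguresApplicationAudioSession = false

    // 3. Add the microphone input, ideally only while actually recording,
    //    so the audio route is not touched when the camera view merely appears.
    if let mic = AVCaptureDevice.default(for: .audio) {
        let micInput = try AVCaptureDeviceInput(device: mic)
        if session.canAddInput(micInput) {
            session.addInput(micInput)
        }
    }
}
```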

Zooming while capturing video using AVCapture in iOS

强颜欢笑 submitted on 2019-12-01 02:01:15
I am using AVCapture to capture video and save it, but I need to provide a zoom option, either pinch-to-zoom or a zoom button. The video should also be saved exactly as it is displayed; that is, when zoomed in, it should be saved zoomed in. Any help or link is appreciated. My code for setting up the AVCapture session is:

    - (void)setupAVCapture{
        session = [[AVCaptureSession alloc] init];
        session.automaticallyConfiguresApplicationAudioSession=YES;
        [session beginConfiguration];
        session.sessionPreset = AVCaptureSessionPresetMedium;
        AVCaptureVideoPreviewLayer…
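A sketch of the usual approach: drive AVCaptureDevice.videoZoomFactor from a pinch gesture. Because the zoom is applied by the capture device itself, it affects both the preview and the recorded movie, so the file is saved zoomed exactly as displayed. The gesture wiring and the zoom cap below are assumptions about the questioner's setup.

```swift
import AVFoundation
import UIKit

// Sketch: pinch-to-zoom that also zooms the recorded video, because
// videoZoomFactor is applied on the capture device, not just the preview.
final class ZoomController {
    private let device: AVCaptureDevice
    private var zoomAtPinchStart: CGFloat = 1.0

    init(device: AVCaptureDevice) { self.device = device }

    @objc func handlePinch(_ pinch: UIPinchGestureRecognizer) {
        if pinch.state == .began {
            zoomAtPinchStart = device.videoZoomFactor
        }
        let maxZoom = min(device.activeFormat.videoMaxZoomFactor, 6.0) // arbitrary cap
        let newZoom = max(1.0, min(zoomAtPinchStart * pinch.scale, maxZoom))
        do {
            try device.lockForConfiguration()
            device.videoZoomFactor = newZoom
            device.unlockForConfiguration()
        } catch {
            print("Could not lock device for zoom: \(error)")
        }
    }
}
```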