AVCaptureSession

Detecting heart rate using the camera

China☆狼群 submitted on 2019-12-28 03:18:07
Question: I need the same functionality as the application Instant Heart Rate. The basic process requires the user to: place the tip of the index finger gently on the camera lens, apply even pressure and cover the entire lens, and hold it steady for 10 seconds to get the heart rate. This can be accomplished by turning the flash on and watching the light change as the blood moves through the index finger. How can I get the light level data from the video capture? Where should I look for this? I looked
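Below is a minimal Swift sketch (not taken from the question) of one way to get at the light level: run an AVCaptureVideoDataOutput with the torch on and average the red channel of each frame; the pulse shows up as a periodic rise and fall of that average. The PulseReader class name and the per-frame averaging loop are illustrative assumptions.

import AVFoundation

// Averages the red channel of each BGRA frame; track the value over ~10 s
// and look for the periodic peaks to estimate the heart rate.
class PulseReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))

        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "pulse"))
        session.addOutput(output)
        session.startRunning()

        // Torch on so the fingertip is back-lit.
        try device.lockForConfiguration()
        if device.hasTorch { device.torchMode = .on }
        device.unlockForConfiguration()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
        let base = CVPixelBufferGetBaseAddress(buffer)!.assumingMemoryBound(to: UInt8.self)

        // BGRA layout: B = 0, G = 1, R = 2. Sampling every pixel is slow but fine for a sketch.
        var sum: UInt64 = 0
        for y in 0..<height {
            let row = base + y * bytesPerRow
            for x in 0..<width { sum += UInt64(row[x * 4 + 2]) }
        }
        let average = Double(sum) / Double(width * height)
        print("average red level:", average)
    }
}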

Torch in background

断了今生、忘了曾经 submitted on 2019-12-25 02:26:32
Question: Is it possible to keep the torch lit while the app is in the background? This is what I do to turn it on:
AVCaptureSession *session = [AVCaptureSession new];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[session beginConfiguration];
[device lockForConfiguration:nil];
device.torchMode = AVCaptureTorchModeOn;
[device unlockForConfiguration];
[session commitConfiguration];
[session startRunning];
But when the app goes to the background the torch is automatically turned
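For reference, the same torch toggle as a minimal Swift sketch; a capture session is not actually required just to switch the torch on. Whether iOS keeps the torch alive once the app is backgrounded is a separate matter, so treat this only as the foreground half of the problem.

import AVFoundation

// Turns the torch on or off, guarding against devices without a torch.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("torch configuration failed:", error)
    }
}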

Capturing images in 24 bpp bitmap from iPhone camera (AVCaptureSession)

人盡茶涼 submitted on 2019-12-25 01:28:41
Question: I'm capturing frames from the front camera of the iPhone using AVCaptureSession. I'm trying to change the format of the AVCaptureVideoDataOutput so it can capture a 24 bpp bitmap. This code gives me a 32 bpp bitmap without any issues:
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
outputDevice.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];
[outputDevice
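A short Swift sketch of how to check what the output will actually accept: AVCaptureVideoDataOutput only delivers a fixed set of pixel formats (on iOS this is typically 32BGRA plus the two 420 YpCbCr variants), so a 24 bpp RGB bitmap generally has to be produced by converting the 32 bpp frames yourself.

import AVFoundation

// List the pixel formats this output supports, then pick 32BGRA.
let output = AVCaptureVideoDataOutput()
for formatType in output.availableVideoPixelFormatTypes {
    print(String(format: "supported pixel format: 0x%08x", formatType))
}
output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                        kCVPixelFormatType_32BGRA]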

Audio output is not working after screen recording using AVCaptureSession

杀马特。学长 韩版系。学妹 submitted on 2019-12-24 22:09:56
Question: I am using AVCaptureSession to capture the macOS screen. When I connect my external headphones as the audio input for recording, the audio device is not released after the recording finishes. I stop the session completely after recording, yet I can't play any audio on the system. When I start recording again, audio output starts playing. Please find the code below; it is the same code as written in the aperture library.
import AVFoundation
enum ApertureError: Error { case
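One thing worth trying, sketched below in Swift as an assumption rather than a confirmed fix: besides calling stopRunning(), explicitly remove the session's inputs and outputs so the capture stack drops its hold on the audio device.

import AVFoundation

// Stop the session and detach every input/output so the underlying
// audio device is no longer held by the capture stack.
func tearDown(_ session: AVCaptureSession) {
    session.stopRunning()
    session.beginConfiguration()
    for input in session.inputs { session.removeInput(input) }
    for output in session.outputs { session.removeOutput(output) }
    session.commitConfiguration()
}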

Capture still image from AVCaptureSession in Swift

心不动则不痛 submitted on 2019-12-24 02:33:13
Question: I have an AVCaptureSession that displays live video in a UIView, and I want to save a frame of the video stream as a UIImage. I've been dissecting the code I keep seeing around the internet, but I'm having trouble with the first line:
if let stillOutput = self.stillImageOutput {
    // Establish an AVCaptureConnection and capture a still image from it.
}
This gives me the error 'Camera' does not have a member named 'stillImageOutput'. The code depends on being able to get the video connection
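A minimal Swift sketch of what the error is pointing at: stillImageOutput is not something a class gets for free, it has to be declared as a property, configured, and added to the session before a connection can be pulled from it. This uses AVCaptureStillImageOutput to match the era of the question (it has since been superseded by AVCapturePhotoOutput); the Camera class shape is an assumption.

import AVFoundation
import UIKit

class Camera {
    let session = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()   // the missing property

    func configure() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))
        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if session.canAddOutput(stillImageOutput) { session.addOutput(stillImageOutput) }
        session.startRunning()
    }

    func captureFrame(completion: @escaping (UIImage?) -> Void) {
        guard let connection = stillImageOutput.connection(with: .video) else {
            completion(nil); return
        }
        stillImageOutput.captureStillImageAsynchronously(from: connection) { buffer, _ in
            guard let buffer = buffer,
                  let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
            else { completion(nil); return }
            completion(UIImage(data: data))
        }
    }
}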

AVCaptureSession VS UIImagePickerController camera preview

大城市里の小女人 submitted on 2019-12-23 18:04:09
Question: I'm developing an application similar to the Instagram iOS app. Instagram has a custom camera preview. I want to build something similar, and the question is: which is better for this purpose, UIImagePickerController with a custom cameraOverlayView property, or AVCaptureSession? Maybe someone has such experience and can give me advice; it will be appreciated. Answer 1: AVCaptureSession is more customisable than UIImagePickerController. In terms of speed, there is not much difference.
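To make the trade-off concrete, here is a minimal Swift sketch of the AVCaptureSession route: the preview is an ordinary layer you can size, place and overlay freely, which is the flexibility UIImagePickerController's cameraOverlayView only partially offers. Function and parameter names are illustrative.

import AVFoundation
import UIKit

// Builds a session and attaches a live preview layer to the given view.
func makePreview(in view: UIView) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    guard let camera = AVCaptureDevice.default(for: .video) else { return session }
    session.addInput(try AVCaptureDeviceInput(device: camera))

    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = view.bounds
    previewLayer.videoGravity = .resizeAspectFill
    view.layer.addSublayer(previewLayer)

    session.startRunning()
    return session
}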

AVCaptureDevice's isLowLightBoostSupported always returns false on 5S iOS7.1 (for automaticallyEnablesLowLightBoostWhenAvailable)

独自空忆成欢 submitted on 2019-12-23 12:36:59
Question: I'm attempting to enable AVCaptureDevice's automaticallyEnablesLowLightBoostWhenAvailable in an iOS camera app, but I've been utterly unable to make AVCaptureDevice's isLowLightBoostSupported return true. Question: is there anything that needs to be done to enable the low light boost API beyond locking for configuration? Is there any known reason that isLowLightBoostSupported would always return false (for all devices) on a fully updated, modern system? I'm testing on a 5S with iOS 7.1. For
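For reference, a minimal Swift sketch of the usual check-then-enable sequence; beyond locking the device for configuration there is nothing else the API requires, and if isLowLightBoostSupported reports false the property simply cannot be enabled on that device/OS combination.

import AVFoundation

// Enable automatic low-light boost only where the device reports support.
func enableLowLightBoost(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if device.isLowLightBoostSupported {
            device.automaticallyEnablesLowLightBoostWhenAvailable = true
        }
        device.unlockForConfiguration()
    } catch {
        print("could not lock device for configuration:", error)
    }
}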

Understand if an AVCaptureDeviceFormat suits video recording

久未见 submitted on 2019-12-23 08:49:51
Question: In an app that I'm developing, I'd like to let the user choose the resolution of video recording. Due to the specification, I can't use the AVCaptureSessionPreset constants. Getting the format list, there are resolutions above 3000 px that of course can't work for video grabbing, only for photo shooting.
AVCaptureDeviceFormat: 0x17020c830 'vide'/'420f' 3264x2448, { 2- 30 fps}, HRSI:3264x2448, fov:58.040, max zoom:153.00 (upscales @1.00), AF System:2, ISO:29.0-1856.0, SS:0.000013-0.500000
I can't find
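A minimal Swift sketch of one way to narrow the list: walk device.formats and keep the entries whose dimensions and frame-rate ranges are plausible for video, then set one as the activeFormat. The exact cut-offs (width at most 1920, at least 30 fps) are assumptions, not a documented rule for separating photo-only formats.

import AVFoundation

// Keep only formats with video-friendly dimensions and frame rates.
func videoFormats(for device: AVCaptureDevice) -> [AVCaptureDevice.Format] {
    return device.formats.filter { format in
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        let supports30fps = format.videoSupportedFrameRateRanges
            .contains { $0.maxFrameRate >= 30 }
        return dims.width <= 1920 && supports30fps
    }
}

// Apply a chosen format; activeFormat requires the configuration lock.
func select(_ format: AVCaptureDevice.Format, on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    device.activeFormat = format
    device.unlockForConfiguration()
}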

How to record my Mac's internal sound, not the microphone!, using AVCaptureSession?

妖精的绣舞 submitted on 2019-12-23 05:34:06
Question: I am trying to implement a simple macOS app with screen recording capabilities. I don't want to record a microphone input, but rather the sound that comes out of my Mac's speakers. Example: this way I want to be able to record a YouTube video to a file. Is this possible with AVCaptureSession? Googling shows examples that capture video and the microphone, but not the internal audio. Here is the working code that I have to capture video and microphone. What do I have to modify to disable the
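A hedged Swift sketch of the audio side only: AVCaptureSession can only read from devices that appear as audio inputs, so the usual approach is to route system output through a virtual loopback device (BlackHole and Soundflower are common examples, not something the question mentions) and add that device to the session in place of the microphone. The device name used here is an assumption.

import AVFoundation

// Find a loopback audio device by name and add it as the session's audio input.
func addLoopbackAudio(to session: AVCaptureSession, deviceName: String = "BlackHole 2ch") throws {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInMicrophone, .externalUnknown],
        mediaType: .audio,
        position: .unspecified)
    guard let loopback = discovery.devices.first(where: { $0.localizedName == deviceName }) else {
        print("loopback device not found; install a loopback driver first")
        return
    }
    let input = try AVCaptureDeviceInput(device: loopback)
    if session.canAddInput(input) { session.addInput(input) }
}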

What is the recommended way to deal with AVCaptureVideoDataOutput image data regarding orientation

那年仲夏 submitted on 2019-12-23 04:14:15
Question: I am writing an application that does some real-time processing on image data it gets from AVCaptureVideoDataOutput within an AVCaptureSession. I am currently able to start the session, add input and output, and then retrieve image data, convert it to a UIImage, and display it on the screen live. The main problem I'm having is that the image's orientation is awkward: it's rotated and mirrored and it also looks skewed. I've done some research, I've found some related questions, and I've
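A minimal Swift sketch of the usual first step: set the orientation (and, for the front camera, mirroring) on the AVCaptureConnection that feeds the data output, so the sample buffers arrive already rotated instead of being corrected per frame. The skew usually comes from ignoring the buffer's bytes-per-row when building the image, which is a separate fix.

import AVFoundation

// Configure orientation and mirroring on the connection feeding the data output.
func configureOrientation(of output: AVCaptureVideoDataOutput, mirrored: Bool) {
    guard let connection = output.connection(with: .video) else { return }
    if connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait
    }
    if connection.isVideoMirroringSupported {
        connection.isVideoMirrored = mirrored
    }
}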