avcapture

AVCaptured image size of screen

走远了吗 · Submitted 2020-07-10 05:28:43

Question: I'm using AVCapture to build a custom camera on iOS. When the user taps the shutter button, I capture the image from the preview layer and assign it to an image view that I size to the screen and place on top of the preview layer. The problem is that the captured image has different proportions from the iOS video camera preview, so the aspect visibly changes the moment a photo is taken. I want the photo to look identical to the preview.
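One possible approach, sketched under assumptions: the preview layer uses `.resizeAspectFill`, so it shows only a sub-rectangle of the full sensor output. `metadataOutputRectConverted(fromLayerRect:)` returns that visible region as a normalized rect, which can then be used to crop the captured image so it matches the preview. The helper name `cropToPreview` is hypothetical, and the sketch assumes the captured image and the metadata-output space share the same orientation.

```swift
import AVFoundation
import UIKit

// Hypothetical helper: crop a full-sensor capture down to the region the
// preview layer actually displayed, so the saved photo matches the preview.
func cropToPreview(_ image: UIImage,
                   previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Normalized (0...1) rect of the capture output visible in the layer.
    let outputRect = previewLayer.metadataOutputRectConverted(
        fromLayerRect: previewLayer.bounds)
    guard let cgImage = image.cgImage else { return nil }

    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let cropRect = CGRect(x: outputRect.origin.x * width,
                          y: outputRect.origin.y * height,
                          width: outputRect.size.width * width,
                          height: outputRect.size.height * height)

    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped,
                   scale: image.scale,
                   orientation: image.imageOrientation)
}
```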

iOS: Torch level on iPhone 11 Pro

亡梦爱人 · Submitted 2020-06-10 09:59:19

Question: I'm using the AVCaptureDevice.setTorchModeOn(level) method to turn on the flashlight at variable brightness. On my old iPhone SE it works fine: I can clearly see four distinct brightness levels as I change level from 0 to 1. But on the iPhone 11 Pro the flashlight only turns on when level is 1.0, and even then its brightness is far from the maximum (compared to the flashlight from Control Center). I tried using the maxAvailableTorchLevel constant, but the results are the same as with 1.0. I also tried other values…
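For reference, a minimal sketch of setting a variable torch level, with the guards that commonly cause silent failures (no torch, configuration not locked, level above the currently available maximum). The function name `setTorch` is illustrative; whether intermediate levels are honored remains device-dependent.

```swift
import AVFoundation

// Sketch: turn the torch on at a given level (0...1), or off for level 0.
func setTorch(level: Float) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        if level > 0 {
            // Throws if the torch is unavailable (e.g. overheated) or the
            // requested level exceeds what the hardware currently allows.
            let clamped = min(level, AVCaptureDevice.maxAvailableTorchLevel)
            try device.setTorchModeOn(torchLevel: clamped)
        } else {
            device.torchMode = .off
        }
    } catch {
        print("Torch error: \(error)")
    }
}
```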

Saving video at 120/240fps

穿精又带淫゛_ · Submitted 2020-02-06 04:39:33

Question: I'm making an app that records video at the device's maximum frame rate (i.e., 120fps on the iPhone 5s and 240fps on the 6 and 6s). I've managed to configure the AVCaptureDevice to set the maxFrameRateDuration; I print currentDevice.activeFormat.videoSupportedFrameRateRanges to the log and everything looks right. But when I attempt to save the video, it does save, yet at the normal frame rate, not at 120 or 240fps. Can anyone help me with this? Any help would be much appreciated. Thanks.
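A sketch of one way to lock a device to its fastest supported format. A common pitfall is that changing `sessionPreset` after this configuration, or letting the session choose its own format, silently resets the frame rate back to the default; the frame durations must also be pinned explicitly, as below.

```swift
import AVFoundation

// Sketch: pick the format with the highest supported frame rate and pin
// both the min and max frame durations to it.
func configureMaxFrameRate(for device: AVCaptureDevice) throws {
    var best: (format: AVCaptureDevice.Format, range: AVFrameRateRange)?
    for format in device.formats {
        for range in format.videoSupportedFrameRateRanges {
            if best == nil || range.maxFrameRate > best!.range.maxFrameRate {
                best = (format, range)
            }
        }
    }
    guard let (format, range) = best else { return }

    try device.lockForConfiguration()
    device.activeFormat = format
    // Pin the frame duration, otherwise the session may fall back to 30fps.
    device.activeVideoMinFrameDuration = range.minFrameDuration
    device.activeVideoMaxFrameDuration = range.minFrameDuration
    device.unlockForConfiguration()
}
```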

Swift isLockingFocusWithCustomLensPositionSupported always returns false

陌路散爱 · Submitted 2020-01-23 17:01:31

Question: I want to set the lens distance of my iPhone X to a constant. To check whether that is supported, I read the isLockingFocusWithCustomLensPositionSupported property of my device, as described in the documentation here: https://developer.apple.com/documentation/avfoundation/avcapturedevice/2361529-islockingfocuswithcustomlensposi The property always returns false, even when the device is locked for configuration, which means that calling the method setFocusModeLocked(lensPosition,…
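A hedged sketch of how the check and the fallback might be structured. Note the property is read after `lockForConfiguration()`, and `isLockingFocusWithCustomLensPositionSupported` is always false in the Simulator and on some devices; the fallback here locks at the current position, which only requires the `.locked` focus mode. The function name `lockLens` is illustrative.

```swift
import AVFoundation

// Sketch: lock the lens at an explicit position if supported, otherwise
// fall back to locking focus at whatever the current position is.
func lockLens(of device: AVCaptureDevice, at position: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    if device.isLockingFocusWithCustomLensPositionSupported {
        device.setFocusModeLocked(lensPosition: position) { time in
            print("Lens locked at \(position) at \(time.seconds)s")
        }
    } else if device.isFocusModeSupported(.locked) {
        // The special constant keeps the lens wherever it currently is.
        device.setFocusModeLocked(
            lensPosition: AVCaptureDevice.currentLensPosition,
            completionHandler: nil)
    }
}
```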

Run multiple AVCaptureSessions or add multiple inputs

流过昼夜 · Submitted 2020-01-20 01:44:30

Question: I want to display the streams of the front- and back-facing cameras of an iPad 2 in two UIViews next to each other. To stream the image of one device I use the following code: AVCaptureDeviceInput *captureInputFront = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil]; AVCaptureSession *session = [[AVCaptureSession alloc] init]; [session addInput:captureInputFront]; [session setSessionPreset:AVCaptureSessionPresetMedium]; [session…
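A sketch of the one-session-per-camera setup, with a caveat: on older iOS versions only one camera can stream at a time (starting the second session typically stops the first), and true simultaneous front+back capture requires AVCaptureMultiCamSession on iOS 13+. The function name `makePreviewSession` is illustrative.

```swift
import AVFoundation
import UIKit

// Sketch: build one session per camera, each feeding its own preview
// layer inside the given host view.
func makePreviewSession(position: AVCaptureDevice.Position,
                        in hostView: UIView) -> AVCaptureSession? {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: position),
          let input = try? AVCaptureDeviceInput(device: device) else {
        return nil
    }
    let session = AVCaptureSession()
    session.sessionPreset = .medium
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let layer = AVCaptureVideoPreviewLayer(session: session)
    layer.frame = hostView.bounds
    layer.videoGravity = .resizeAspectFill
    hostView.layer.addSublayer(layer)
    session.startRunning()
    return session
}
```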

Getting actual NSString of AvCaptureVideoDataOutput availableVideoCVPixelFormatTypes

依然范特西╮ · Submitted 2020-01-12 14:31:11

Question: I am trying to find the accepted formats on an AVFoundation output: self.theOutput = [[AVCaptureVideoDataOutput alloc] init]; if ([self.theSession canAddOutput:self.theOutput]) [self.theSession addOutput:self.theOutput]; I then insert a breakpoint right after and run: po [self.theOutput availableVideoCVPixelFormatTypes] and I get this: (NSArray *) $5 = 0x2087ad00 <__NSArrayM 0x2087ad00>( 875704438, 875704422, 1111970369 ) How do I get the string values of these format types? Thanks. Answer 1: On an…
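The numbers above are FourCC codes: four ASCII characters packed into a 32-bit integer, most significant byte first. Unpacking the bytes recovers the readable names, which can then be matched against the `kCVPixelFormatType_*` constants. A small sketch:

```swift
import Foundation

// Decode a packed FourCC pixel-format code into its 4-character string.
func fourCCString(_ code: UInt32) -> String {
    let bytes: [UInt8] = [
        UInt8((code >> 24) & 0xFF),
        UInt8((code >> 16) & 0xFF),
        UInt8((code >> 8) & 0xFF),
        UInt8(code & 0xFF),
    ]
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

// 875704438  → "420v" (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
// 875704422  → "420f" (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
// 1111970369 → "BGRA" (kCVPixelFormatType_32BGRA)
```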

AVCaptureStillImageOutput Image returning upside down

不打扰是莪最后的温柔 · Submitted 2020-01-01 05:33:40

Question: I am capturing images from the camera using captureStillImageAsynchronouslyFromConnection, and the images are being saved upside down. I tried saving the image using UIImage *flippedImage = [UIImage imageWithCGImage:image.CGImage scale:image.scale orientation:UIImageOrientationUp], but this orientation lives only in the metadata, so when I copy the data from the file to process it, the processed images are upside down. (Quick Look shows the originals upright, but the processed/filtered images are…
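One common fix, sketched here: redraw the image so the orientation is baked into the pixel data instead of being carried only in metadata. UIKit applies the orientation transform during drawing, so the rendered copy always comes out with `.up` orientation and survives metadata-stripping processing steps. The helper name `normalizedImage` is hypothetical.

```swift
import UIKit

// Sketch: re-render the image so orientation is applied to the pixels.
func normalizedImage(_ image: UIImage) -> UIImage {
    guard image.imageOrientation != .up else { return image }
    let format = UIGraphicsImageRendererFormat.default()
    format.scale = image.scale
    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
}
```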