AVCam

Check which camera is currently in use in an iOS application

Question: I'm writing an app which has a custom-made view for taking photos with the camera, similar to Apple's AVCam. In it, I want the flash button to disappear and re-appear every time the camera is switched, i.e. when using the front camera the flash button shouldn't be there, and when using the back camera it should. My code for this at the moment is:

AVCaptureDevicePosition position = [[videoInput device] position];
if (position == AVCaptureDevicePositionBack) {
    self.flashButton.hidden ==
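The excerpt above is cut off mid-statement, and "hidden ==" is a comparison rather than an assignment. A minimal sketch of the behaviour the question describes, reusing the same videoInput and flashButton names and additionally checking AVCaptureDevice's hasFlash so the button also stays hidden on cameras without a flash unit (a sketch, not the poster's final code):

- (void)updateFlashButtonVisibility
{
    AVCaptureDevice *device = [videoInput device];
    // Show the flash button only for the back camera, and only if that camera
    // actually has a flash unit to control.
    BOOL showFlash = (device.position == AVCaptureDevicePositionBack) && device.hasFlash;
    self.flashButton.hidden = !showFlash;
}

Calling this wherever the capture input is swapped keeps the button in sync with the camera currently in use.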

iPhone App - Show AVFoundation video on landscape mode

I am using the AVCam example app from Apple. This example uses AVFoundation to show video on a view. I am trying to turn AVCam into a landscape app, with no luck: when the screen orientation changes, the video is shown rotated on the view. Is there a way of handling this problem?

Samssonart: When you create your preview layer:

captureVideoPreviewLayer.orientation = UIInterfaceOrientationLandscapeLeft;

And in the methods that manage rotations:

-(void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration {
    [CATransaction
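The answer above is cut off and relies on the preview layer's orientation property, which was deprecated in iOS 6 in favour of setting the orientation on the layer's connection. A minimal sketch of that approach, assuming a previewLayer property of type AVCaptureVideoPreviewLayer on the view controller:

- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                         duration:(NSTimeInterval)duration
{
    AVCaptureConnection *connection = self.previewLayer.connection;
    if (connection.isVideoOrientationSupported) {
        // The four standard UIInterfaceOrientation values map onto the matching
        // AVCaptureVideoOrientation values, so a cast is commonly used here.
        connection.videoOrientation = (AVCaptureVideoOrientation)toInterfaceOrientation;
    }
    self.previewLayer.frame = self.view.bounds;
}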

iPhone Camera Focussing

Question: I used the code below for focusing the iPhone camera, but it is not working. I took this code from Apple's AVCam sample. Am I doing anything wrong? Is there any method to detect whether the iPhone has finished focussing?

-(void)focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (device != nil) {
        if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                [device setFocusPointOfInterest:point];
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
                [device unlockForConfiguration];
            }
        }
    }
}
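The second question, how to tell when focussing has actually happened, is not answered in the excerpt. One common approach (a sketch, not taken from the original thread) is to key-value observe the capture device's adjustingFocus property, which flips to YES while the camera hunts and back to NO once focus settles:

// Register once, e.g. right after obtaining the capture device:
[device addObserver:self
         forKeyPath:@"adjustingFocus"
            options:NSKeyValueObservingOptionNew
            context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"adjustingFocus"]) {
        BOOL adjusting = [change[NSKeyValueChangeNewKey] boolValue];
        if (!adjusting) {
            // adjustingFocus went back to NO: the focus operation has completed.
        }
    }
}

Remember to call removeObserver:forKeyPath: when tearing the session down.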

Converting BGRA to ARGB

Question: I'm reading this tutorial on getting pixel data from the iPhone camera. While I have no issue running and using this code, I need to take the camera output (which comes in BGRA) and convert it to ARGB so that I can use it with an external library. How do I do this?

Answer 1: If you're on iOS 5.0, you can use vImage within the Accelerate framework to do a NEON-optimized color component swap using code like the following (drawn from Apple's WebCore source code):

vImage_Buffer src;
src.height = height;
src.width = width;
src.rowBytes = srcBytesPerRow;
src.data = srcRows;

vImage_Buffer dest;
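The excerpt stops at the declaration of the destination buffer. A sketch of how it would typically continue (destBytesPerRow and destRows are assumed names for a destination allocation the caller provides, and Accelerate/Accelerate.h must be imported):

dest.height = height;
dest.width = width;
dest.rowBytes = destBytesPerRow;
dest.data = destRows;

// permuteMap[i] is the index of the source channel copied into destination channel i,
// so { 3, 2, 1, 0 } reverses each BGRA pixel into ARGB.
const uint8_t permuteMap[4] = { 3, 2, 1, 0 };
vImagePermuteChannels_ARGB8888(&src, &dest, permuteMap, kvImageNoFlags);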

AVCam not in fullscreen

I've integrated AVCam in my iOS app. The problem is that on iPhone 4 the preview frame isn't fullscreen; it has empty side borders. How can I solve this? Thanks.

You need to use the videoGravity property of the AVCaptureVideoPreviewLayer (take a look at the documentation). You need to use:

AVLayerVideoGravityResizeAspectFill

Edit: Based on that, the solution found by the asker is:

- (void)setSession:(AVCaptureSession *)session
{
    ((AVPlayerLayer *)[self layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
    ((AVPlayerLayer *)[self layer]).bounds = ((AVPlayerLayer *)[self layer]).bounds;
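The asker's snippet casts the view's backing layer to AVPlayerLayer, but in Apple's AVCam sample the preview view's layer is an AVCaptureVideoPreviewLayer. A sketch of the same fix written against that class, assuming the view overrides +layerClass the way AVCam does:

+ (Class)layerClass
{
    return [AVCaptureVideoPreviewLayer class];
}

- (void)setSession:(AVCaptureSession *)session
{
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)[self layer];
    // Fill the layer's bounds, cropping the video rather than letterboxing it.
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.session = session;
}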

Consecutive calls to startRecordingToOutputFileURL:

Question: The Apple docs seem to indicate that while recording video to a file, the app can change the URL on the fly with no problem, but I'm seeing a problem. When I try this, the recording delegate gets called with an error:

The operation couldn‚Äôt be completed. (OSStatus error -12780.) Info dictionary is: { AVErrorRecordingSuccessfullyFinishedKey = 0; }

(The funky single quote in "couldn't" comes from logging [error localizedDescription].) Here's the code, which is basically a set of tweaks to the WWDC10 AVCam sample:

1) Start recording. Start a timer to change the output URL every few seconds:

- (void)
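The accepted fix is not part of the excerpt. One common workaround for error -12780 (a sketch of my own, not the original thread's answer) is to treat each URL change as a stop/start pair and only begin the next file from the delegate callback that confirms the previous one finished; nextSegmentURL below is an assumed helper that builds a fresh file URL:

// When the timer fires, finish the current file first.
- (void)segmentTimerFired:(NSTimer *)timer
{
    if ([movieFileOutput isRecording]) {
        [movieFileOutput stopRecording];
    }
}

// AVCaptureFileOutputRecordingDelegate: start the next segment only after the
// previous one has been completely written out.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    [movieFileOutput startRecordingToOutputFileURL:[self nextSegmentURL]
                                 recordingDelegate:self];
}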

How to record video in ProRes codec on iOS?

I would like to capture video from the device's back camera directly to the ProRes codec, now that .proRes422 and .proRes4444 are available as AVVideoCodecType options in iOS 11. But on both iPhone X and the second-generation iPad Pro, I receive an error that recording is:

unsupported given the current configuration

when trying to capture video with the following code:

movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.proRes422], for: movieFileOutputConnection!)

If this approach is wrong, can the captured video be encoded by using AVCaptureVideoDataOutput alongside AVAssetWriter?
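The question is left open in the excerpt. Below is a sketch of the alternative it asks about, feeding AVCaptureVideoDataOutput frames into an AVAssetWriter configured for ProRes, written in Objective-C. AVVideoCodecTypeAppleProRes422 is assumed to be the Objective-C spelling of .proRes422, and whether the writer actually accepts ProRes still depends on the device and OS, which is why the capability check matters:

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL   // a movie file URL you choose
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];

NSDictionary *proResSettings = @{ AVVideoCodecKey  : AVVideoCodecTypeAppleProRes422,
                                  AVVideoWidthKey  : @1920,
                                  AVVideoHeightKey : @1080 };

if ([writer canApplyOutputSettings:proResSettings forMediaType:AVMediaTypeVideo]) {
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:proResSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    [writer addInput:writerInput];
    // In captureOutput:didOutputSampleBuffer:fromConnection:, append each frame with
    // [writerInput appendSampleBuffer:sampleBuffer], bracketed by -startWriting /
    // -startSessionAtSourceTime: and a final -finishWritingWithCompletionHandler:.
} else {
    // ProRes is not available for writing in this configuration.
}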