avcapturesession

AVCaptureVideoPreviewLayer orientation - need landscape

Submitted by Deadly on 2019-11-27 11:05:42
My app is landscape only. I'm presenting the AVCaptureVideoPreviewLayer like this: self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; [self.previewLayer setBackgroundColor:[[UIColor blackColor] CGColor]]; [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect]; NSLog(@"previewView: %@", self.previewView); CALayer *rootLayer = [self.previewView layer]; [rootLayer setMasksToBounds:YES]; [self.previewLayer setFrame:[rootLayer bounds]]; NSLog(@"previewlayer: %f, %f, %f, %f", self.previewLayer.frame.origin.x, self.previewLayer.frame.origin.y, self
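A minimal sketch of one way to handle this in current Swift: a landscape-only app generally has to rotate the preview layer's connection itself, because the capture connection does not follow the UI orientation. The controller and `previewView` outlet below are stand-ins for the question's own setup, not part of it.

```swift
import AVFoundation
import UIKit

final class LandscapePreviewController: UIViewController {
    // Assumed to exist elsewhere in the app, as in the question.
    let session = AVCaptureSession()
    @IBOutlet weak var previewView: UIView!

    private var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.backgroundColor = UIColor.black.cgColor
        layer.videoGravity = .resizeAspect
        previewView.layer.masksToBounds = true
        layer.frame = previewView.layer.bounds

        // The session does not rotate with the UI; for a landscape-only app the
        // preview connection has to be rotated explicitly.
        if let connection = layer.connection, connection.isVideoOrientationSupported {
            connection.videoOrientation = .landscapeRight
        }

        previewView.layer.addSublayer(layer)
        previewLayer = layer
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Keep the preview layer matched to the host view after layout changes.
        previewLayer?.frame = previewView.layer.bounds
    }
}
```

The same `videoOrientation` usually has to be set on the still/movie output's connection as well, or captured media comes out rotated even though the preview looks correct.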

Detecting heart rate using the camera

Submitted by 假如想象 on 2019-11-27 10:16:52
I need the same functionality as the application Instant Heart Rate. The basic process requires the user to: Place the tip of the index finger gently on the camera lens. Apply even pressure and cover the entire lens. Hold it steady for 10 seconds and get the heart rate. This can be accomplished by turning the flash on and watching the light change as the blood moves through the index finger. How can I get the light level data from the video capture? Where should I look for this? I looked through the class AVCaptureDevice but didn't find anything useful. I also found
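One common approach is to turn the torch on, attach an AVCaptureVideoDataOutput configured for BGRA frames, and average a color channel per frame; with a finger covering the lens, that averaged signal pulses with each heartbeat. A minimal sketch of the per-frame measurement (the class name and the session/torch setup around it are assumptions, not from the question):

```swift
import AVFoundation

// Assumes a session already configured with the back camera (torch enabled via
// lockForConfiguration) and a video data output set to kCVPixelFormatType_32BGRA
// delivering frames to this delegate.
final class BrightnessSampler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let buffer = base.assumingMemoryBound(to: UInt8.self)

        // Average the red channel (BGRA layout: B, G, R, A). For a real app,
        // downsample or use Accelerate instead of touching every pixel.
        var total: UInt64 = 0
        for y in 0..<height {
            let row = y * bytesPerRow
            for x in 0..<width {
                total += UInt64(buffer[row + x * 4 + 2])
            }
        }
        let averageRed = Double(total) / Double(width * height)
        print("average red level: \(averageRed)")
        // Feed averageRed into a peak detector or FFT over ~10 s to estimate BPM.
    }
}
```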

iOS tap to focus

Submitted by 若如初见. on 2019-11-27 02:41:17
Question: I used this code to achieve tap-to-focus in an iOS custom camera app, but it isn't working. Here's the code: override func touchesBegan(touches: NSSet, withEvent event: UIEvent) { let touchPer = touches.anyObject() as UITouch let screenSize = UIScreen.mainScreen().bounds.size var focus_x = touchPer.locationInView(self.view).x / screenSize.width var focus_y = touchPer.locationInView(self.view).y / screenSize.height if let device = captureDevice { if(device.lockForConfiguration(nil)) { device
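For reference, a minimal tap-to-focus sketch in current Swift. Two things differ from the question's code: lockForConfiguration() now throws rather than returning a Bool, and the preview layer can convert the tap into the device's normalized coordinate space instead of dividing by the screen size (which ignores videoGravity and orientation). The function and parameter names below are illustrative.

```swift
import AVFoundation
import UIKit

// Assumes `captureDevice` and `previewLayer` correspond to the question's
// existing camera device and on-screen AVCaptureVideoPreviewLayer.
func focus(at tapPoint: CGPoint,
           in previewLayer: AVCaptureVideoPreviewLayer,
           using captureDevice: AVCaptureDevice) {
    // Convert from view coordinates to the device's (0,0)-(1,1) space.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: tapPoint)

    do {
        // lockForConfiguration() throws; `if device.lockForConfiguration(nil)` no longer compiles.
        try captureDevice.lockForConfiguration()
        defer { captureDevice.unlockForConfiguration() }

        if captureDevice.isFocusPointOfInterestSupported {
            captureDevice.focusPointOfInterest = devicePoint
            captureDevice.focusMode = .autoFocus
        }
        if captureDevice.isExposurePointOfInterestSupported {
            captureDevice.exposurePointOfInterest = devicePoint
            captureDevice.exposureMode = .continuousAutoExposure
        }
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```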

Why does AVCaptureVideoOrientation landscape modes result in upside down still images?

Submitted by 被刻印的时光 ゝ on 2019-11-27 01:59:24
Question: I am using AVFoundation classes to implement a custom camera in my app. I am only capturing still images, not video. I have everything working but am stumped by something. I take the device orientation into account when a still image is captured and set the videoOrientation of the video connection appropriately. A code snippet: // set the videoOrientation based on the device orientation to // ensure the pic is right side up for all orientations AVCaptureVideoOrientation videoOrientation;
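The usual cause is that the two landscape enums do not map to the same-named case: UIDeviceOrientation.landscapeLeft corresponds to AVCaptureVideoOrientation.landscapeRight, and vice versa, because device orientation describes how the device is rotated while video orientation describes where the home button ends up. A minimal sketch of the mapping (the `stillConnection` name in the usage comment is hypothetical):

```swift
import AVFoundation
import UIKit

// The landscape cases are intentionally crossed: UIDeviceOrientation.landscapeLeft
// means the home button is on the right, which is AVCaptureVideoOrientation.landscapeRight.
// Using the same-named case for both enums is what produces upside-down stills.
func videoOrientation(for deviceOrientation: UIDeviceOrientation) -> AVCaptureVideoOrientation {
    switch deviceOrientation {
    case .portrait:            return .portrait
    case .portraitUpsideDown:  return .portraitUpsideDown
    case .landscapeLeft:       return .landscapeRight
    case .landscapeRight:      return .landscapeLeft
    default:                   return .portrait   // face up/down, unknown
    }
}

// Usage before capturing a still (stillConnection = the output's video connection):
// stillConnection.videoOrientation = videoOrientation(for: UIDevice.current.orientation)
```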

AVCaptureSession and background audio iOS 7

Submitted by 拟墨画扇 on 2019-11-27 01:09:36
Whenever I start an AVCaptureSession running with the microphone as an input, it cancels whatever background music is currently playing (iPod music, for instance). If I comment out the line adding the audio input, the background audio continues. Does anyone know a way to record video clips with the microphone while continuing to allow background audio to play? There is also an error when you try to record video while music is currently playing. I tried to do it like this: [[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: nil]; UInt32 doSetProperty = 1;
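A minimal sketch (current Swift, iOS 10+ audio-session API) of one way to keep background audio playing: use a mixable play-and-record category and stop the capture session from overriding it. The function name is illustrative; `captureSession` stands in for the question's already-configured session.

```swift
import AVFoundation

func configureAudioForMixing(with captureSession: AVCaptureSession) throws {
    let audioSession = AVAudioSession.sharedInstance()

    // .mixWithOthers is what keeps the iPod/Music audio from being interrupted
    // when the session activates with a microphone input.
    try audioSession.setCategory(.playAndRecord,
                                 mode: .videoRecording,
                                 options: [.mixWithOthers])
    try audioSession.setActive(true)

    // Stop AVCaptureSession from re-applying its own (non-mixable) audio
    // session configuration when it starts running.
    captureSession.automaticallyConfiguresApplicationAudioSession = false
    captureSession.usesApplicationAudioSession = true
}
```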

Switch cameras with AVCaptureSession

Submitted by 半腔热情 on 2019-11-27 00:53:39
Using this tutorial: http://www.musicalgeometry.com/?p=1297 I have created a custom overlay and image capture with AVCaptureSession. I am attempting to allow the user to switch between the front and back cameras. Here is my code in CaptureSessionManager to switch cameras: - (void)addVideoInputFrontCamera:(BOOL)front { NSArray *devices = [AVCaptureDevice devices]; AVCaptureDevice *frontCamera; AVCaptureDevice *backCamera; for (AVCaptureDevice *device in devices) { //NSLog(@"Device name: %@", [device localizedName]); if ([device hasMediaType:AVMediaTypeVideo]) { if ([device position] ==
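A minimal sketch of the swap itself in current Swift, wrapped in beginConfiguration/commitConfiguration so the running session changes inputs atomically. The function and parameter names are illustrative, not from the tutorial.

```swift
import AVFoundation

// Assumes `captureSession` is running and `currentVideoInput` is its current
// AVCaptureDeviceInput; returns whichever input ends up attached.
func switchCamera(on captureSession: AVCaptureSession,
                  replacing currentVideoInput: AVCaptureDeviceInput) -> AVCaptureDeviceInput {
    let newPosition: AVCaptureDevice.Position =
        (currentVideoInput.device.position == .back) ? .front : .back

    guard let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: newPosition),
          let newInput = try? AVCaptureDeviceInput(device: newDevice) else {
        return currentVideoInput
    }

    // begin/commitConfiguration applies the input change in one step instead of
    // tearing the session down and restarting it in between.
    captureSession.beginConfiguration()
    captureSession.removeInput(currentVideoInput)
    if captureSession.canAddInput(newInput) {
        captureSession.addInput(newInput)
        captureSession.commitConfiguration()
        return newInput
    } else {
        captureSession.addInput(currentVideoInput)   // roll back if the new input is rejected
        captureSession.commitConfiguration()
        return currentVideoInput
    }
}
```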

Scanning Barcode or QR code in Swift 3.0 using AVFoundation

Submitted by ∥☆過路亽.° on 2019-11-27 00:47:37
Question: I am following this tutorial and tried to convert the code from Swift 2.0 to 3.0. But when I launch the application, the app doesn't work! I mean, nothing happens! Here is my code: ViewController: class ViewController: UIViewController, BarcodeDelegate { override func prepare(for segue: UIStoryboardSegue, sender: Any?) { let barcodeViewController: BarcodeViewController = segue.destination as! BarcodeViewController barcodeViewController.delegate = self } func barcodeReaded(barcode: String) {
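A frequent reason a migrated scanner "does nothing" is that the delegate method's signature no longer matches what AVFoundation calls, so it is silently never invoked; a missing NSCameraUsageDescription entry in Info.plist produces the same symptom on iOS 10 and later. A minimal sketch of the scanner side in current Swift (the BarcodeDelegate protocol is reconstructed from the question's barcodeReaded(barcode:) call):

```swift
import AVFoundation
import UIKit

protocol BarcodeDelegate: AnyObject {
    func barcodeReaded(barcode: String)   // method name taken from the question
}

final class BarcodeViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    weak var delegate: BarcodeDelegate?
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        // Metadata types must be set *after* the output is added to the session,
        // otherwise the available types list is empty.
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr, .ean13, .code128]

        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.layer.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)

        session.startRunning()
    }

    // Current delegate signature; an old Swift 3-era signature is never called.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let value = code.stringValue else { return }
        session.stopRunning()
        delegate?.barcodeReaded(barcode: value)
    }
}
```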

Simplified screen capture: record video of only what appears within the layers of a UIView?

Submitted by 妖精的绣舞 on 2019-11-26 23:38:02
Question: This SO answer addresses how to do a screen capture of a UIView. We need something similar, but instead of a single image, the goal is to produce a video of everything appearing within a UIView over 60 seconds -- conceptually like recording only the layers of that UIView, ignoring other layers. Our video app superimposes layers on whatever the user is recording, and the ultimate goal is to produce a master video merging those layers with the original video. However, using
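One approach that fits that description is to snapshot only the target view's layer on a CADisplayLink and append each snapshot to an AVAssetWriter as a video frame. A minimal sketch with illustrative class and property names; note that layer.render(in:) captures ordinary layer trees but not AVPlayerLayer or camera-preview content.

```swift
import AVFoundation
import UIKit

final class ViewRecorder {
    private let view: UIView
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0

    init(view: UIView, outputURL: URL) throws {
        self.view = view
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

        let size = view.bounds.size
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(size.width),
            AVVideoHeightKey: Int(size.height)
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
                kCVPixelBufferWidthKey as String: Int(size.width),
                kCVPixelBufferHeightKey as String: Int(size.height)
            ])
        writer.add(input)
    }

    func start() {
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
        startTime = CACurrentMediaTime()
        displayLink = CADisplayLink(target: self, selector: #selector(captureFrame))
        displayLink?.add(to: .main, forMode: .common)
    }

    func stop(completion: @escaping () -> Void) {
        displayLink?.invalidate()
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }

    @objc private func captureFrame() {
        guard input.isReadyForMoreMediaData,
              let pool = adaptor.pixelBufferPool else { return }

        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        CVPixelBufferLockBaseAddress(buffer, [])
        if let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                   width: CVPixelBufferGetWidth(buffer),
                                   height: CVPixelBufferGetHeight(buffer),
                                   bitsPerComponent: 8,
                                   bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                   space: CGColorSpaceCreateDeviceRGB(),
                                   bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) {
            // Flip into UIKit's coordinate system, then draw only this view's layers.
            context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(buffer)))
            context.scaleBy(x: 1, y: -1)
            view.layer.render(in: context)
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])

        let elapsed = CACurrentMediaTime() - startTime
        adaptor.append(buffer, withPresentationTime: CMTime(seconds: elapsed, preferredTimescale: 600))
    }
}
```

The resulting file can then be composited with the original camera footage using AVMutableComposition.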

Avoiding blurriness at start & end of video (even after using setPreferredVideoStabilizationMode:AVCaptureVideoStabilizationModeAuto)?

Submitted by 好久不见. on 2019-11-26 23:14:23
We capture video on iOS while using setPreferredVideoStabilizationMode:AVCaptureVideoStabilizationModeAuto , but the video still sometimes comes out blurry at the start and at the end (fine in the middle, though), which is very problematic because we grab the first frame as a still image (in order to enable video & photo capabilities without switching camera modes). Placing the device flat on a desk removes all blurriness, so the whole video is sharp throughout. This suggests it has something to do with video stabilization, but is there another property to set? Does locking the focus mode
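If the blur is focus/exposure hunting rather than motion, one mitigation is to enable smooth autofocus and lock focus and exposure just before starting the recording, then unlock once the first-frame still has been grabbed. A minimal sketch, assuming `device` is the active AVCaptureDevice (the function name is illustrative):

```swift
import AVFoundation

func lockFocusBeforeRecording(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }

        // Smooth autofocus slows the lens movement so any refocus during
        // recording is less visible.
        if device.isSmoothAutoFocusSupported {
            device.isSmoothAutoFocusEnabled = true
        }
        // Locking focus and exposure right before starting prevents the
        // initial hunt that blurs the opening frames.
        if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked
        }
        if device.isExposureModeSupported(.locked) {
            device.exposureMode = .locked
        }
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```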

Focus (Autofocus) not working in camera (AVFoundation AVCaptureSession)

Submitted by 时光总嘲笑我的痴心妄想 on 2019-11-26 20:16:16
Question: I am using the standard AVFoundation classes to capture video and show a preview (http://developer.apple.com/library/ios/#qa/qa1702/_index.html). Here is my code: - (void)setupCaptureSession { NSError *error = nil; [self setCaptureSession: [[AVCaptureSession alloc] init]]; self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device
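For comparison, a minimal version of that setup in current Swift: the focus change must sit between lockForConfiguration() and unlockForConfiguration(), and devices that report isFocusModeSupported(.continuousAutoFocus) as false (for example fixed-focus front cameras) simply cannot autofocus at all.

```swift
import AVFoundation

func enableContinuousAutoFocus() -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.sessionPreset = .medium

    guard let device = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    if device.isFocusModeSupported(.continuousAutoFocus) {
        do {
            // Focus configuration only takes effect inside a lock/unlock pair.
            try device.lockForConfiguration()
            device.focusMode = .continuousAutoFocus
            device.unlockForConfiguration()
        } catch {
            print("Could not lock device for configuration: \(error)")
        }
    }
    return session
}
```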