AVCaptureSession

ios AVCaptureVideoPreviewLayer capture current image

做~自己de王妃 submitted on 2019-12-01 01:50:06
Once the default iPhone Camera app takes a photo, a preview appears and the image animates to the camera roll button. I am trying to replicate this animation.

```objc
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

CALayer *viewLayer = self.vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);

captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = CGRectMake(0, 0, 322, 425);
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

device = [AVCaptureDevice
```
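One way to approach the animation, sketched here under assumptions: a `stillImageOutput` (`AVCaptureStillImageOutput`, the era-appropriate API) is already attached to `session`, and `cameraRollButton` is a hypothetical outlet marking the animation's destination. After capture, a temporary `UIImageView` is animated from the preview frame to the button.

```objc
// Hedged sketch: capture a still, then animate a thumbnail of it from the
// preview area toward a (hypothetical) cameraRollButton outlet.
AVCaptureConnection *connection =
    [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
    if (error) { return; }
    NSData *data =
        [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
    UIImage *image = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        UIImageView *thumb = [[UIImageView alloc] initWithImage:image];
        thumb.frame = self.vImagePreview.frame;
        [self.view addSubview:thumb];
        [UIView animateWithDuration:0.4 animations:^{
            thumb.frame = self.cameraRollButton.frame; // assumed outlet
            thumb.alpha = 0.3;
        } completion:^(BOOL finished) {
            [thumb removeFromSuperview];
        }];
    });
}];
```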

AVAudioSession setCategory not working

瘦欲@ submitted on 2019-11-30 21:56:27
I have a video capturing app and I want to be able to play background music while recording audio+video. I can accomplish this if I set the AVAudioSession category to PlayAndRecord in didFinishLaunchingWithOptions . However, this causes a glitch in the audio whenever the view with the camera enters or exits the foreground, and it's apparently impossible to get rid of: https://forums.developer.apple.com/message/74778#74778 I can live with the glitch if it just happens when I start/stop recording video, but that means I need to change the AVAudioSession category from Ambient to PlayAndRecord when
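The approach the question is heading toward can be sketched as follows: keep the category Ambient while idle and switch to PlayAndRecord, with the MixWithOthers option so background music keeps playing, only for the duration of a recording. This confines any category-change glitch to the start/stop moments. A minimal sketch (method names are illustrative):

```objc
// Sketch: switch AVAudioSession category only around recording, so the
// audible glitch happens at start/stop rather than on every view transition.
- (void)beginRecordingAudioSession {
    NSError *error = nil;
    AVAudioSession *audio = [AVAudioSession sharedInstance];
    [audio setCategory:AVAudioSessionCategoryPlayAndRecord
           withOptions:AVAudioSessionCategoryOptionMixWithOthers // keep music playing
                 error:&error];
    [audio setActive:YES error:&error];
}

- (void)endRecordingAudioSession {
    NSError *error = nil;
    [[AVAudioSession sharedInstance]
        setCategory:AVAudioSessionCategoryAmbient error:&error];
}
```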

Zooming while capturing video using AVCapture in iOS

China☆狼群 submitted on 2019-11-30 20:32:16
Question: I am using AVCapture to capture video and save it. But I need to provide a zooming option, like pinch to zoom or a zoom button. Also, the video should be saved in exactly the same manner in which it is being displayed; I mean when zoomed in, it should be saved zoomed. Any help or link is appreciated. My code for setting up the AVCapture session is:

```objc
- (void)setupAVCapture
{
    session = [[AVCaptureSession alloc] init];
    session.automaticallyConfiguresApplicationAudioSession = YES;
    [session
```
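A common way to get both requirements at once is `AVCaptureDevice.videoZoomFactor` (iOS 7+): because the zoom is applied on the capture device itself, an `AVCaptureMovieFileOutput` records exactly what the preview shows. A sketch, assuming `self.device` holds the active `AVCaptureDevice`:

```objc
// Sketch: pinch-to-zoom via videoZoomFactor. The device must be locked
// for configuration before the factor is changed.
- (void)handlePinch:(UIPinchGestureRecognizer *)pinch {
    NSError *error = nil;
    if ([self.device lockForConfiguration:&error]) {
        // Cap the zoom; videoMaxZoomFactor can be very large (digital upscaling).
        CGFloat maxZoom = MIN(self.device.activeFormat.videoMaxZoomFactor, 6.0);
        CGFloat newZoom = self.device.videoZoomFactor * pinch.scale;
        self.device.videoZoomFactor = MAX(1.0, MIN(newZoom, maxZoom));
        [self.device unlockForConfiguration];
    }
    pinch.scale = 1.0; // reset so each callback applies an incremental scale
}
```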

iOS 8 iPad AVCaptureMovieFileOutput drops / loses / never gets audio track after 13 - 14 seconds of recording

六月ゝ 毕业季﹏ submitted on 2019-11-30 17:24:13
I have the following code, which works for iOS 6 & 7.x. In iOS 8.1 I have a strange issue where, if you capture a session for about 13 seconds or longer, the resulting AVAsset only has 1 track (video); the audio track is just not there. If you record for a shorter period, the AVAsset has 2 tracks (video and audio) as expected. I have plenty of disk space, and the app has permission to use the camera and microphone. I created a new project with minimal code; it reproduced the issue. Any ideas would be greatly appreciated.

```objc
#import "ViewController.h"

@interface ViewController ()
@end

@implementation
```
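A widely reported workaround for this symptom points at fragmented movie writing: `AVCaptureMovieFileOutput` writes the file in 10-second fragments by default, and on some iOS 8 builds the audio track was reportedly lost once recording crossed the first fragment boundary, which matches the ~13-second threshold. Disabling fragment writing is the usual fix (hedged; this is a community workaround, not documented behavior):

```objc
// Workaround sketch: disable movie fragments so the whole file is written
// as one atom; avoids the iOS 8 missing-audio-track issue on long recordings.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
movieOutput.movieFragmentInterval = kCMTimeInvalid;
```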

What is the role of AVCaptureDeviceType.builtInDualCamera

怎甘沉沦 submitted on 2019-11-30 15:56:08
I am playing with Swift and an iPhone 7 Plus. I am working with builtInWideAngleCamera and builtInTelephotoCamera. This is great, even if I cannot get the 2 images simultaneously. I saw in the Apple documentation that AVCaptureDeviceType contains a builtInDualCamera entry. What is the purpose of this device in AVFoundation, since we cannot do anything (zoom, depth effect) with the Apple API? In other words, I cannot see the difference between builtInDualCamera and builtInWideAngleCamera when working with AVCaptureDeviceType, AVCaptureSession and so on. Thanks. The Dual-Camera option is to choose the
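For context: builtInDualCamera is a *virtual* device wrapping both physical back cameras; when it is the session's input, the system switches between the wide and telephoto modules automatically as the zoom factor changes, which neither physical device type does on its own. A discovery sketch (note the caveat, stated as a hedge: on iOS 10.0–10.1 the same device type was named BuiltInDuoCamera, renamed BuiltInDualCamera in 10.2):

```objc
// Sketch: ask for the virtual dual camera explicitly.
AVCaptureDeviceDiscoverySession *discovery = [AVCaptureDeviceDiscoverySession
    discoverySessionWithDeviceTypes:@[ AVCaptureDeviceTypeBuiltInDualCamera ]
                          mediaType:AVMediaTypeVideo
                           position:AVCaptureDevicePositionBack];
// nil on devices without two back cameras.
AVCaptureDevice *dualCamera = discovery.devices.firstObject;
```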

AVCaptureSession with multiple Outputs?

别来无恙 submitted on 2019-11-30 13:42:56
Question: I'm currently developing an iOS app that applies CoreImage to the camera feed in order to take photos and videos, and I've run into a bit of a snag. Up till now I've been using AVCaptureVideoDataOutput to obtain the sample buffers and manipulate them with CoreImage, then displayed a simple preview, as well as using it to capture photos and save them. When I tried to implement video recording by writing the SampleBuffers to a video as I received them from the AVCaptureVideoDataOutput , it
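The usual pattern for this setup is a single AVCaptureVideoDataOutput feeding an AVAssetWriter, rather than a second movie-file output (an AVCaptureMovieFileOutput and an AVCaptureVideoDataOutput generally cannot coexist usefully in one session). A partial sketch, assuming `outputURL` and `error` are defined by the caller:

```objc
// Sketch: record filtered frames with AVAssetWriter instead of a second output.
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                            AVVideoWidthKey  : @1280,
                            AVVideoHeightKey : @720 };
AVAssetWriterInput *videoInput = [AVAssetWriterInput
    assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
videoInput.expectsMediaDataInRealTime = YES;
[writer addInput:videoInput];
// Then, in captureOutput:didOutputSampleBuffer:fromConnection:, render the
// filtered CIImage into a CVPixelBuffer and append it through an
// AVAssetWriterInputPixelBufferAdaptor.
```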

How can I extract an AVMetadataObject from a UIImage?

随声附和 submitted on 2019-11-30 12:13:22
Question: I'd like to use iOS 7's new barcode scanning functionality with a UIImage instead of live capture from one of the device's cameras. I already have the detection working fine with an AVCaptureDeviceInput . The best way I can think of to do this would be to create a concrete subclass of AVCaptureInput that provides media data to an AVCaptureSession from a UIImage . However, I can't find any documentation or examples on how to subclass AVCaptureInput , so I'm at a loss. An alternative would be to
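Subclassing AVCaptureInput for this is not a supported path; for still images the usual alternative is CIDetector. A caveat on availability: CIDetectorTypeQRCode arrived in iOS 8 (and covers QR only; on iOS 11+ Vision's VNDetectBarcodesRequest handles more symbologies), so this sketch does not match the iOS 7 constraint exactly:

```objc
// Sketch: detect QR codes in a UIImage with CIDetector instead of a capture session.
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode
                                          context:nil
                                          options:@{ CIDetectorAccuracy :
                                                     CIDetectorAccuracyHigh }];
CIImage *ciImage = [CIImage imageWithCGImage:uiImage.CGImage];
for (CIQRCodeFeature *feature in [detector featuresInImage:ciImage]) {
    NSLog(@"payload: %@", feature.messageString);
}
```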

AVCaptureSession gets a memory warning and crashes for no apparent reason

匆匆过客 submitted on 2019-11-30 11:33:26
Question: I am working on an app that manipulates HD photos. I am taking a photo with an AVCaptureSession, stopping it, and then applying effects to that photo. The thing that makes me CRAZY is that everything works fine: Instruments tells me that I release all the memory I use properly and on time. It goes really high, yes, sometimes to 100 MB, but it goes down quickly. Then I restart my capture session and I get a memory warning. There is absolutely no reason for that ;_; All the memory I used is freed...
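One thing worth trying in this situation (a hedged suggestion, not a confirmed fix for this question): tear the capture pipeline down completely, inputs and outputs included, before doing the memory-heavy image work, so its internal buffers are not held across the effects pass, and rebuild it afterwards.

```objc
// Sketch: fully release the session's graph before heavy image processing.
- (void)tearDownSession {
    [self.session stopRunning];
    for (AVCaptureOutput *output in [self.session.outputs copy]) {
        [self.session removeOutput:output];
    }
    for (AVCaptureInput *input in [self.session.inputs copy]) {
        [self.session removeInput:input];
    }
    self.session = nil; // rebuild lazily when capture is needed again
}
```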

iOS: captureOutput:didOutputSampleBuffer:fromConnection is NOT called

狂风中的少年 submitted on 2019-11-30 10:24:28
I want to pull frames from the live feed of an AVCaptureSession and I am using Apple's AVCam as a test case. Here is the link to AVCam: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html I found that captureOutput:didOutputSampleBuffer:fromConnection is NOT called, and I would like to know why, or what I am doing wrong. Here is what I have done:

(1) I make the AVCamViewController a delegate:

```objc
@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate,
                                   AVCaptureVideoDataOutputSampleBufferDelegate>
```

(2) I created an AVCaptureVideoDataOutput object and add
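For reference, the minimum wiring that makes this delegate fire looks like the sketch below. A likely culprit in AVCam specifically: the sample already attaches an AVCaptureMovieFileOutput, and a session will not deliver sample buffers to an AVCaptureVideoDataOutput while a movie file output is attached, so one of the two has to be removed.

```objc
// Sketch: data-output setup whose delegate actually gets called,
// assuming no AVCaptureMovieFileOutput remains on the session.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                   @(kCVPixelFormatType_32BGRA) };
videoDataOutput.alwaysDiscardsLateVideoFrames = YES;

// Delegate callbacks must be delivered on a serial queue.
dispatch_queue_t frameQueue =
    dispatch_queue_create("frame.queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];

if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}
```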

How can I get autofocus to work in a second AVCaptureSession without recreating the sessions?

江枫思渺然 submitted on 2019-11-30 10:19:53
Autofocus is not working on the first AVCaptureSession when I create a second AVCaptureSession. The second session to be created is the one where autofocus works, and the first created one does not autofocus. I would expect that either session would be able to autofocus when started after the other one is stopped, in the same way the auto white balance and auto exposure work for both sessions. If you observe the log window with the sample code below, you can see the key-value-observing messages coming through, but never the changing-focus message when the top session is running. Sidenote:
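One thing worth trying before recreating sessions (hedged; this is a plausible mitigation rather than a confirmed fix): re-assert the focus mode on the device each time one of the sessions starts running, since the mode can silently stop being applied after the device has been claimed by another session.

```objc
// Sketch: re-apply continuous autofocus whenever this session resumes,
// assuming `device` is the AVCaptureDevice backing the session's input.
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
    }
    [device unlockForConfiguration];
}
```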