AVCaptureSession

How to record and save at 240 frames per second?

故事扮演 submitted on 2019-12-11 15:30:03
Question: I need to record and save video from an iPhone Xs at the phone's max frame rate (240 fps). The saved file always ends up at 30 fps. I've been through a dozen guides/docs/Stack Overflow posts but have yet to hit on the right solution. I've tested by opening the recorded file in VLC as well as by extracting and counting frames. What am I doing wrong? Environment: Xcode 10.1, build target iOS 12.1, tested on an iPhone Xs running iOS 12.1.2. Here I access the device and configure it for the best …
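The excerpt cuts off before the configuration code, but the usual cause of "configured 240, got 30" is ordering: setting the sessionPreset resets the device's activeFormat. A minimal sketch of the common fix, assuming the session's inputs are already attached, is to pick a 240 fps-capable format and lock it in after session configuration:

    import AVFoundation

    // Sketch: select a format whose frame-rate ranges reach the target,
    // then pin the frame duration. Run this after configuring the session;
    // changing sessionPreset afterwards would reset activeFormat.
    func configure(for fps: Double, on device: AVCaptureDevice) throws {
        guard let format = device.formats.first(where: { f in
            f.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= fps }
        }) else { return }

        try device.lockForConfiguration()
        device.activeFormat = format
        let frameDuration = CMTime(value: 1, timescale: CMTimeScale(fps))
        device.activeVideoMinFrameDuration = frameDuration
        device.activeVideoMaxFrameDuration = frameDuration
        device.unlockForConfiguration()
    }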

Issue with captureStillImageAsynchronouslyFromConnection for Back camera

眉间皱痕 submitted on 2019-12-11 06:24:14
Question:

    let VideoDevice = CameraWithPosition(AVCaptureDevicePosition.Back)  // not working
    let VideoDevice = CameraWithPosition(AVCaptureDevicePosition.Front) // working

    if let stillOutput = self.stillImageOutput {
        if let videoConnection = stillOutput.connectionWithMediaType(AVMediaTypeVideo) {
            println("stillOutput \(stillOutput)")
            stillOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer: CMSampleBuffer!, _) in
                println("imageSampleBuffer \(imageSampleBuffer)")
                …
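The CameraWithPosition helper isn't shown, so this is guesswork, but a frequent culprit is a lookup that silently falls back to the default (front) device. A hedged sketch of a position-safe lookup using the modern discovery-session API:

    import AVFoundation

    // Sketch: return the camera at the requested position, or nil,
    // rather than falling back to some other device.
    func camera(at position: AVCaptureDevice.Position) -> AVCaptureDevice? {
        return AVCaptureDevice.DiscoverySession(
            deviceTypes: [.builtInWideAngleCamera],
            mediaType: .video,
            position: position
        ).devices.first
    }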

Reading QR codes on iOS with AVCaptureSession — alignment issues?

左心房为你撑大大i submitted on 2019-12-11 03:01:16
Question: We have implemented a QR code reading function in iOS using the AVCaptureSession class, as described nicely here: https://github.com/HEmobile/ScanBarCode/tree/master/ScanBarCodes https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureSession_Class/ But one thing we notice... the QR code has to be aligned exactly vertically or horizontally. Oblique angles such as 45 degrees do not trigger a scan. Searching for this issue turns up surprisingly little. Our …
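For reference, a minimal metadata-output setup is sketched below; the detector itself is normally rotation-tolerant, so an angle sensitivity like this more often points at a narrowed rectOfInterest or a preview/sensor coordinate mismatch than at the decoder:

    import AVFoundation

    // Sketch: attach a QR metadata output. `delegate` is assumed to adopt
    // AVCaptureMetadataOutputObjectsDelegate.
    func addQRScanning(to session: AVCaptureSession,
                       delegate: AVCaptureMetadataOutputObjectsDelegate) {
        let metadataOutput = AVCaptureMetadataOutput()
        guard session.canAddOutput(metadataOutput) else { return }
        session.addOutput(metadataOutput)
        metadataOutput.setMetadataObjectsDelegate(delegate, queue: .main)
        // Types must be set after the output joins the session. The default
        // rectOfInterest is the full frame; narrowing it can make off-axis
        // codes fall outside the scanned region.
        metadataOutput.metadataObjectTypes = [.qr]
    }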

AVCaptureFileOutput: Not recording at 240 fps

帅比萌擦擦* submitted on 2019-12-11 01:33:40
Question: I seem to be having a problem where I am setting the camera to record at 240 FPS, but for some reason the output file is only 30 FPS. Here is my code for setting up the camera (this is instantiated first):

    class HFRCamera {
    public:
        HFRCamera();
        AVCaptureDeviceInput *camera;
        AVCaptureDeviceInput *microphone;
        AVCaptureDevice *videoCam;
        AVCaptureDevice *audioInput;
        AVCaptureSession *capSession;
        void start();
        void config();
        void stop();
    };

    HFRCamera::HFRCamera() {
        // Set up capture session and …
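The configuration fix is the same as for the first question above (set activeFormat and the frame durations after session setup rather than before). As a quick diagnostic, the rate that actually landed in the file can be read back with AVFoundation; a small Swift sketch:

    import AVFoundation

    // Sketch: confirm the recorded file's nominal frame rate.
    func recordedFrameRate(of fileURL: URL) -> Float? {
        let asset = AVAsset(url: fileURL)
        return asset.tracks(withMediaType: .video).first?.nominalFrameRate
    }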

AVLayerVideoGravityResize does not match on new devices, iOS 10?

空扰寡人 submitted on 2019-12-11 00:34:43
Question: Camera with a full-screen live preview:

    previewLayer!.videoGravity = AVLayerVideoGravityResize

Make an image:

    stillImageOutput?.captureStillImageAsynchronously(
        from: videoConnection,
        completionHandler: …

The full-screen live preview will, or should, precisely match the still image. (For clarity: say you accidentally use AVLayerVideoGravityResizeAspectFill. In that case the live preview will NOT match the still image - you'll see a "jump" as it is stretched.) However... if you try the below …
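One way to guarantee a match regardless of device or gravity mode, sketched below as an assumption-laden alternative rather than the question's approach: map the preview layer's bounds into image coordinates and crop the still to exactly what was shown.

    import AVFoundation
    import UIKit

    // Sketch: crop a captured CGImage to what the preview layer displayed.
    func cropToPreview(_ cgImage: CGImage,
                       previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
        // Normalized (0...1) rect of the visible region in image space.
        let rect = previewLayer.metadataOutputRectConverted(
            fromLayerRect: previewLayer.bounds)
        let cropRect = CGRect(x: rect.minX * CGFloat(cgImage.width),
                              y: rect.minY * CGFloat(cgImage.height),
                              width: rect.width * CGFloat(cgImage.width),
                              height: rect.height * CGFloat(cgImage.height))
        return cgImage.cropping(to: cropRect).map { UIImage(cgImage: $0) }
    }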

Swift 3: Using AVCaptureAudioDataOutput to analyze audio input

帅比萌擦擦* submitted on 2019-12-10 22:36:10
Question: I'm trying to use AVCaptureAudioDataOutput to analyze audio input, as described here. This is not stuff I could figure out on my own, so I'm copying the example, but I'm having difficulty. Xcode in Swift 3 has prompted me to make a couple of changes. I'm getting a compile error on the line assigning samples. Xcode says, "Cannot invoke initializer for type 'UnsafeMutablePointer<_>' with an argument list of type '(UnsafeMutableRawPointer?)'". Here's the code as I've modified it:

    func …
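That error is the Swift 3 pointer model at work: an UnsafeMutableRawPointer? can no longer be fed straight into UnsafeMutablePointer's initializer; the raw memory has to be bound to a concrete element type first. A sketch of that step, assuming 16-bit samples in the AudioBuffer:

    import AVFoundation

    // Sketch: bind the raw audio bytes to Int16 before reading samples.
    func samples(in audioBuffer: AudioBuffer) -> [Int16] {
        guard let rawData = audioBuffer.mData else { return [] }
        let count = Int(audioBuffer.mDataByteSize) / MemoryLayout<Int16>.size
        let typed = rawData.bindMemory(to: Int16.self, capacity: count)
        return Array(UnsafeBufferPointer(start: typed, count: count))
    }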

Strange behaviour after modifying exposure duration and going back to AVCaptureExposureModeContinuousAutoExposure

故事扮演 submitted on 2019-12-10 20:20:02
Question: I am working on an app that exposes manual controls for the camera with the new APIs introduced in iOS 8, and I am using this sample app from WWDC 2014 as a reference. However, I noticed a strange behaviour (on my 5s and on a 6): after setting the exposure mode to "custom" and then back to "auto", the image continues to lag as if the exposure duration were not affected by the change. Here is the code involved in each step (from the sample app, without any modification):

    - (IBAction…
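A hedged guess at the cause: a long custom exposure duration can drag the device's active frame durations down, and that override survives the switch back to auto exposure, so the preview keeps the sluggish frame rate. A sketch of the reset, relying on kCMTimeInvalid restoring the active format's defaults:

    import AVFoundation

    // Sketch: clear the frame-duration overrides a long custom exposure
    // leaves behind; .invalid (kCMTimeInvalid) restores the format default.
    func resetToAutoExposure(_ device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        device.exposureMode = .continuousAutoExposure
        device.activeVideoMinFrameDuration = .invalid
        device.activeVideoMaxFrameDuration = .invalid
        device.unlockForConfiguration()
    }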

iOS Capture high resolution photo while using a low AVCaptureSessionPreset for video output

放肆的年华 submitted on 2019-12-10 17:40:59
问题 I have pretty much the same question as this one below: Switching AVCaptureSession preset when capturing a photo The issue however is that the (self) answer doesn't help me one bit. I am wondering if someone has a clue as to how to do this. I am capture video frames so I can process them and do something with them. For this, I am using the AVCaptureSessionPrese640x480 as I need all the frame rate I can get while getting a decent frame for computation. Now, when the user wants to capture the
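On iOS 10+ (a newer API than the AVCaptureStillImageOutput era this question dates from, so treat it as one option rather than a drop-in fix), AVCapturePhotoOutput can deliver a full-resolution still while the session preset stays low for the frame pipeline:

    import AVFoundation

    // Sketch: a photo output that delivers full-resolution stills even
    // while the session preset stays at 640x480 for video frames.
    func addHighResPhotoOutput(to session: AVCaptureSession) -> AVCapturePhotoOutput? {
        let photoOutput = AVCapturePhotoOutput()
        guard session.canAddOutput(photoOutput) else { return nil }
        session.addOutput(photoOutput)
        photoOutput.isHighResolutionCaptureEnabled = true
        return photoOutput
    }

    // Per shot (`delegate` is assumed to adopt AVCapturePhotoCaptureDelegate):
    //     let settings = AVCapturePhotoSettings()
    //     settings.isHighResolutionPhotoEnabled = true
    //     photoOutput.capturePhoto(with: settings, delegate: delegate)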

Convert a CMSampleBuffer into a UIImage

痞子三分冷 submitted on 2019-12-10 16:27:43
Question: Here's a function (code from Apple documentation) that converts a CMSampleBuffer into a UIImage:

    func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        var imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0)
        // Get the number of bytes per row for the pixel buffer
        var baseAddress = CVPixelBufferGetBaseAddress…
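A shorter route that sidesteps the manual CGContext plumbing (and the Swift 3 lock-flag churn in the old Apple sample) is to go through Core Image; a sketch, assuming Core Image is acceptable for the use case:

    import AVFoundation
    import UIKit

    // Sketch: pixel buffer -> CIImage -> CGImage -> UIImage.
    func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        return UIImage(cgImage: cgImage)
    }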

AVCaptureVideoDataOutput and setting kCVPixelBufferWidthKey & kCVPixelBufferHeightKey

混江龙づ霸主 submitted on 2019-12-10 13:08:28
Question: I'm trying to capture frames at a specific size from AVCaptureVideoDataOutput by setting kCVPixelBufferWidthKey & kCVPixelBufferHeightKey. The problem is that the buffer width and height never change; they always come back as 852x640. Here is my code:

    // Add the video frame output
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoOutput setAlwaysDiscardsLateVideoFrames:YES];
    // Use RGB frames instead of YUV to ease color processing
    [videoOutput setVideoSettings:[NSDictionary …
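On iOS, AVCaptureVideoDataOutput's videoSettings honors the pixel-format key but not the width/height keys; frame size is driven by the session preset (or the device's activeFormat), which is why the buffers never change. A Swift sketch of the usual approach:

    import AVFoundation

    // Sketch: set the frame size via the session preset instead of the
    // pixel-buffer width/height keys, which this output ignores on iOS.
    func addVideoOutput(to session: AVCaptureSession) {
        session.sessionPreset = .vga640x480
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.alwaysDiscardsLateVideoFrames = true
        // Only the pixel format is reliably honored in videoSettings here.
        videoOutput.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }
    }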