AVCaptureSession

Alternatives to creating an OpenGL texture from a captured video frame to overlay an OpenGL view over video? (iPhone)

和自甴很熟 · submitted on 2019-12-18 11:44:41
Question: This is mostly relevant for augmented-reality-type applications. Apple provides information on how to capture video frames (and save them as images if need be) with AVCaptureSession here: http://developer.apple.com/library/ios/#qa/qa2010/qa1702.html I know that it is possible to create an OpenGL texture out of a captured video frame and then use that as a background in the OpenGL view over which to overlay other graphics. I am wondering whether there are any alternatives to this method? The method
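One commonly suggested alternative is to skip the per-frame texture upload entirely: let AVCaptureVideoPreviewLayer render the video and draw the OpenGL content in a transparent view stacked above it. A minimal sketch (the function name is illustrative, not from the question):

```swift
import AVFoundation
import GLKit

// Sketch: show the camera with AVCaptureVideoPreviewLayer and place a
// transparent GL view on top, instead of uploading each captured frame
// as an OpenGL texture.
func addPreviewAndOverlay(to view: UIView, session: AVCaptureSession, context: EAGLContext) {
    let preview = AVCaptureVideoPreviewLayer(session: session)
    preview.frame = view.bounds
    preview.videoGravity = .resizeAspectFill
    view.layer.addSublayer(preview)

    let glView = GLKView(frame: view.bounds, context: context)
    glView.isOpaque = false              // transparent so the video shows through
    glView.backgroundColor = .clear
    glView.drawableColorFormat = .RGBA8888
    view.addSubview(glView)              // OpenGL overlay sits above the video layer
}
```

The trade-off is that the GL code can no longer sample the video pixels; if the effects only sit on top of the video, this avoids the texture-cache plumbing entirely.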

How to calculate FOV?

折月煮酒 · submitted on 2019-12-18 11:03:07
Question: Initial context: I am developing a location-based augmented reality application and I need to get the field of view (FOV). (I just update the value when the orientation changes, so I am looking for a method that returns this value when I call it.) The goal is to make a "degree ruler" matching reality, like the following: I am already using AVCaptureSession to display the camera stream, and a path coupled with a CAShapeLayer to draw the ruler. This is working pretty well, but now I have to use
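AVFoundation exposes the camera's horizontal field of view directly on the active capture format, which fits the "call it on demand" requirement. A hedged sketch (the helper name is illustrative):

```swift
import AVFoundation

// Sketch: read the horizontal field of view (in degrees) of the camera's
// active format. The vertical FOV can be derived from the frame's aspect
// ratio if needed.
func horizontalFOV() -> Float? {
    guard let device = AVCaptureDevice.default(for: .video) else { return nil }
    return device.activeFormat.videoFieldOfView
}
```

Note that `videoFieldOfView` depends on the active format, so it should be re-read if the session preset or format changes (e.g. when switching cameras).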

Allow music playback while recording video, like Snapchat (iOS)

好久不见. · submitted on 2019-12-18 10:56:24
Question: First I want to describe the scenario in Snapchat. When you start recording a video while a song is playing in the background, Snapchat lets the song continue during recording, and after you record the video you can still hear the song playing in the background. I am using SCRecorder to record video and capture images with my custom layout. I want the same behavior, but the problem is that whenever I start recording, the background song stops playing.
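The usual cause is that starting recording reconfigures the app's audio session with a category that interrupts other audio. A hedged sketch of the fix, assuming the capture library lets you manage the audio session yourself (on a plain AVCaptureSession you would also set `automaticallyConfiguresApplicationAudioSession = false`):

```swift
import AVFoundation

// Sketch: configure the shared audio session so background music keeps
// playing while recording. The key is the .mixWithOthers option on the
// .playAndRecord category.
func allowBackgroundMusicWhileRecording() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.mixWithOthers, .defaultToSpeaker])
    try session.setActive(true)
}
```

Without `.mixWithOthers`, activating a record-capable category interrupts the music app's playback, which matches the symptom described.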

AVCapture capturing and getting framebuffer at 60 fps in iOS 7

余生颓废 · submitted on 2019-12-18 10:16:14
Question: I'm developing an app that requires capturing the framebuffer at as many fps as possible. I've already figured out how to force the iPhone to capture at 60 fps, but the - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection method is only being called 15 times a second, which means that the iPhone downgrades the capture output to 15 fps. Has anybody faced this problem? Is there any possibility to increase
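Two things typically cap the delegate rate: the device was never pinned to a 60 fps format, or the delegate callback itself is too slow (late frames get dropped when `alwaysDiscardsLateVideoFrames` is set). A hedged sketch of pinning the frame rate (the helper name is illustrative):

```swift
import AVFoundation

// Sketch: pick a format that supports 60 fps, then pin both the min and
// max frame duration to 1/60 s. If only one duration is set, or the
// delegate work takes longer than 1/60 s, the session can silently fall
// back to a lower delivered rate.
func configure60FPS(device: AVCaptureDevice) throws {
    for format in device.formats {
        for range in format.videoSupportedFrameRateRanges where range.maxFrameRate >= 60 {
            try device.lockForConfiguration()
            device.activeFormat = format
            device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 60)
            device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: 60)
            device.unlockForConfiguration()
            return
        }
    }
}
```

Even with this, the delegate must return well within 16 ms per frame, or frames will be discarded and the observed rate drops.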

Why does AVCaptureSession output the wrong orientation?

北城以北 · submitted on 2019-12-17 23:44:12
Question: So, I followed Apple's instructions for capturing a video session using AVCaptureSession: http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html. One problem I'm facing is that even though the orientation of the camera / iPhone device is vertical (and the AVCaptureVideoPreviewLayer shows a vertical camera stream), the output image seems to be in landscape mode. I checked the width and height of imageBuffer inside imageFromSampleBuffer: of the sample code, and I got 640px and 480px
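The camera sensor delivers landscape buffers by default; the preview layer rotates its display, but the sample buffers themselves are not rotated. One common fix is to set the orientation on the output's video connection. A hedged sketch:

```swift
import AVFoundation

// Sketch: ask the output's connection to rotate delivered buffers so
// sample buffers match the portrait device orientation.
func forcePortrait(output: AVCaptureVideoDataOutput) {
    if let connection = output.connection(with: .video),
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait
    }
}
```

The alternative is to leave the buffers in landscape and apply the rotation when building the UIImage (via its `imageOrientation`), which avoids per-frame rotation cost.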

Taking photo with custom camera Swift 3

懵懂的女人 · submitted on 2019-12-17 19:00:21
Question: In Swift 2.3 I used this code to take a picture with a custom camera: func didPressTakePhoto(){ if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) { stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in if sampleBuffer != nil { let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer) let dataProvider = CGDataProviderCreateWithCFData(imageData) let
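In Swift 3 / iOS 10, AVCaptureStillImageOutput was deprecated in favor of AVCapturePhotoOutput plus a capture delegate. A hedged sketch of the replacement flow (class and method names other than the AVFoundation APIs are illustrative):

```swift
import AVFoundation
import UIKit

// Sketch: the AVCapturePhotoOutput replacement for the deprecated
// AVCaptureStillImageOutput still-image flow.
class PhotoTaker: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()   // add this to the session instead

    func didPressTakePhoto() {
        let settings = AVCapturePhotoSettings()
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // iOS 11+ callback; iOS 10 used the didFinishProcessingPhotoSampleBuffer variant.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        _ = image   // use the captured UIImage here (display, save, etc.)
    }
}
```

Note that AVCapturePhotoSettings instances are single-use; a fresh one must be created for each capture.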

How do I use the metadataOutputRectOfInterestForRect method and rectOfInterest property to scan a specific area? (QR Code)

老子叫甜甜 · submitted on 2019-12-17 17:37:23
Question: I am building a QR code scanner with Swift and everything works in that regard. The issue I have is that I want only a small area of the entire visible AVCaptureVideoPreviewLayer to be able to scan QR codes. I have found out that in order to specify which area of the screen will be able to read/capture QR codes I would have to use a property of AVCaptureMetadataOutput called rectOfInterest. The trouble is that when I assigned a CGRect to it, I couldn't scan anything. After doing more
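The usual pitfall is that rectOfInterest is not in screen coordinates but in the metadata output's normalized, rotated coordinate space, so a raw CGRect almost never matches anything. The preview layer provides the conversion. A hedged sketch (parameter names are illustrative):

```swift
import AVFoundation
import UIKit

// Sketch: convert an on-screen scan box into the metadata output's
// coordinate space. This must be done after the session has started
// running, or the conversion can return a wrong rect.
func restrictScanArea(previewLayer: AVCaptureVideoPreviewLayer,
                      metadataOutput: AVCaptureMetadataOutput,
                      scanBox: CGRect) {
    metadataOutput.rectOfInterest =
        previewLayer.metadataOutputRectConverted(fromLayerRect: scanBox)
}
```

A common pattern is to perform this conversion in an observer for the session's `AVCaptureSessionDidStartRunning` notification, so the preview layer's geometry is final when the rect is computed.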

AVCaptureSession: specify resolution and quality of captured images (Obj-C iPhone app)

蹲街弑〆低调 · submitted on 2019-12-17 15:52:08
Question: Hi, I want to set up an AV capture session to capture images with a specific resolution (and, if possible, a specific quality) using the iPhone camera. Here is the AV session setup code: // Create and configure a capture session and start it running - (void)setupCaptureSession { NSError *error = nil; // Create the session self.captureSession = [[AVCaptureSession alloc] init]; // Configure the session to produce lower resolution video frames, if your // processing algorithm can cope. We'll specify
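Resolution is chosen through the session preset, which maps to fixed output sizes per device. A hedged sketch of the preset knob (shown in Swift rather than the question's Obj-C, for consistency with the other examples here):

```swift
import AVFoundation

// Sketch: session presets select the capture resolution; always check
// canSetSessionPreset first, since support varies by device and camera.
func configureSession(_ session: AVCaptureSession) {
    session.beginConfiguration()
    if session.canSetSessionPreset(.hd1280x720) {
        session.sessionPreset = .hd1280x720   // 1280x720 output frames
    }
    session.commitConfiguration()
}
```

Compression quality is a separate setting applied on the output (e.g. JPEG quality when encoding the captured frame), not on the session itself.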

How to capture picture with AVCaptureSession in Swift?

匆匆过客 · submitted on 2019-12-17 15:23:53
Question: I have a UIViewController in which I use AVCaptureSession to show the camera, and it is working just fine and fast. I placed a UIButton object on top of this camera view and added an IBAction for the button. This is how it looks right now. Now I want to get a picture of the current camera view when the user taps the button: @IBAction func takePicture(sender: AnyObject) { // omg, what to do?! } I have no idea whatsoever how I can do that. I imagined there could have been something like
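For the AVFoundation APIs of that era, the answer is to add an AVCaptureStillImageOutput to the session and grab a still from its video connection in the button action. A hedged sketch (deprecated since iOS 10 in favor of AVCapturePhotoOutput; the helper name is illustrative):

```swift
import AVFoundation
import UIKit

// Sketch: capture a still frame from a running session using
// AVCaptureStillImageOutput, which must already be added as an output.
func takePicture(from output: AVCaptureStillImageOutput) {
    guard let connection = output.connection(with: .video) else { return }
    output.captureStillImageAsynchronously(from: connection) { buffer, error in
        guard error == nil, let buffer = buffer,
              let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer),
              let image = UIImage(data: data) else { return }
        _ = image   // the captured photo as a UIImage
    }
}
```

The completion handler runs off the main thread, so any UI work with the resulting image should be dispatched back to the main queue.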