AVCaptureSession

How to get the Y component from a CMSampleBuffer produced by AVCaptureSession?

大憨熊 submitted on 2019-11-28 18:02:31
Hey there, I am trying to access raw data from the iPhone camera using AVCaptureSession, following the guide provided by Apple ( link here ). The raw data from the sample buffer is in YUV format (am I correct here about the raw video frame format?). How can I directly obtain the data for the Y component out of the raw data stored in the sample buffer? When setting up the AVCaptureVideoDataOutput that returns the raw camera frames, you can set the format of the frames using code like the following:

[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType
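A minimal Swift sketch of extracting the luma plane, assuming the output's videoSettings requested a bi-planar YUV format such as kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (in which case plane 0 of the pixel buffer is the Y component); the function name is illustrative:

```swift
import AVFoundation
import Foundation

// Sketch: copy the Y (luma) plane out of a camera sample buffer.
// Assumes a bi-planar 420 YpCbCr pixel format, so plane 0 is luma.
func copyLumaPlane(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    // Note: rows may be padded; copy row-by-row via
    // CVPixelBufferGetWidthOfPlane if you need a tightly packed buffer.
    return Data(bytes: base, count: bytesPerRow * height)
}
```

Locking the base address before reading (and unlocking afterwards, here via `defer`) is required for CPU access to the pixel data.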

Why does AVCaptureSession output a wrong orientation?

自作多情 submitted on 2019-11-28 16:05:15
So, I followed Apple's instructions to capture a video session using AVCaptureSession : http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html . One problem I'm facing is that even though the orientation of the camera / iPhone device is vertical (and the AVCaptureVideoPreviewLayer shows a vertical camera stream), the output image seems to be in landscape mode. I checked the width and height of the imageBuffer inside imageFromSampleBuffer: in the sample code, and I got 640px and 480px respectively. Does anyone know why this is the case? Thanks! Jon Steinmetz Take a look at the header
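The camera sensor natively delivers landscape frames (hence 640×480), so the usual fix is to set the orientation on the output's video connection before capturing. A hedged Swift sketch, assuming `stillImageOutput` is an output already attached to the session:

```swift
import AVFoundation

// Sketch: ask the connection to rotate frames to portrait.
// `stillImageOutput` is assumed to be an output added to the session.
func forcePortrait(on stillImageOutput: AVCaptureOutput) {
    if let connection = stillImageOutput.connection(with: .video),
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait
    }
}
```

Checking `isVideoOrientationSupported` first avoids an exception on connections that cannot rotate.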

Dot Product and Luminance / findmyicone

瘦欲@ submitted on 2019-11-28 10:31:41
Question: All, I have a basic question that I am struggling with here. When you look at the findmyicone sample code from WWDC 2010, you will see this:

static const uint8_t orangeColor[] = {255, 127, 0};
uint8_t referenceColor[3];

// Remove luminance
static inline void normalize( const uint8_t colorIn[], uint8_t colorOut[] ) {
    // Dot product
    int sum = 0;
    for (int i = 0; i < 3; i++)
        sum += colorIn[i] / 3;
    for (int j = 0; j < 3; j++)
        colorOut[j] = (float) ((colorIn[j] / (float) sum) * 255);
}

And then it
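Despite the "Dot product" comment, `sum` is really the (integer) average of the three channels — a rough brightness estimate — and dividing each channel by it discards overall luminance, leaving only the color's chromaticity for comparison. A Swift transcription of the same logic (with an explicit clamp, since the C version can overflow a uint8_t — e.g. for orange {255, 127, 0} the first output is about 512):

```swift
// Swift sketch of the sample's normalize(): divide each channel by the
// per-pixel brightness estimate so only chromaticity remains.
func normalize(_ colorIn: [UInt8]) -> [UInt8] {
    // Matches the C code's integer division: sum of colorIn[i] / 3.
    let sum = colorIn.prefix(3).reduce(0) { $0 + Int($1) / 3 }
    guard sum > 0 else { return [0, 0, 0] }
    // Clamped to 255 here; the C original silently truncates on overflow.
    return colorIn.prefix(3).map { UInt8(min(255, (Float($0) / Float(sum)) * 255)) }
}
```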

Taking a photo with a custom camera in Swift 3

回眸只為那壹抹淺笑 submitted on 2019-11-28 09:28:28
In Swift 2.3 I used this code to take a picture with a custom camera:

func didPressTakePhoto() {
    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
        stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
            if sampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProviderCreateWithCFData(imageData)
                let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent
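In Swift 3 / iOS 10, AVCaptureStillImageOutput was deprecated in favor of AVCapturePhotoOutput, so the idiomatic port uses the new delegate-based API rather than a completion handler. A hedged sketch (the `PhotoTaker` class and its wiring are illustrative; the delegate method shown is the iOS 10 / Swift 3 one, renamed again in iOS 11):

```swift
import AVFoundation
import UIKit

final class PhotoTaker: NSObject, AVCapturePhotoCaptureDelegate {
    // Assumed to be added to a running AVCaptureSession elsewhere.
    let photoOutput = AVCapturePhotoOutput()

    func didPressTakePhoto() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // iOS 10 / Swift 3 callback; iOS 11 renamed it to
    // photoOutput(_:didFinishProcessingPhoto:error:).
    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        guard let buffer = photoSampleBuffer,
              let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
                  forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil),
              let image = UIImage(data: data) else { return }
        // Use `image` (display it, save it, etc.).
        _ = image
    }
}
```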

iOS tap to focus

限于喜欢 submitted on 2019-11-28 09:04:11
I used this code to achieve tap-to-focus in an iOS custom camera app, but it isn't working. Here's the code:

override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    let touchPer = touches.anyObject() as UITouch
    let screenSize = UIScreen.mainScreen().bounds.size
    var focus_x = touchPer.locationInView(self.view).x / screenSize.width
    var focus_y = touchPer.locationInView(self.view).y / screenSize.height
    if let device = captureDevice {
        if(device.lockForConfiguration(nil)) {
            device.focusMode = AVCaptureFocusMode.ContinuousAutoFocus
            device.focusPointOfInterest = CGPointMake(focus_x, focus
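Two common problems with code like the above: the point of interest must be in the capture device's normalized, landscape-relative coordinate space (the preview layer can do that conversion for you, rather than dividing by screen size), and it should be set before the focus mode, using a one-shot `.autoFocus` so the tap actually triggers a refocus. A modern-Swift sketch, assuming `captureDevice` and `previewLayer` exist as in the question:

```swift
import AVFoundation
import UIKit

// Sketch: tap-to-focus. `captureDevice` and `previewLayer`
// (an AVCaptureVideoPreviewLayer) are assumed from the question.
func focus(at viewPoint: CGPoint,
           device: AVCaptureDevice,
           previewLayer: AVCaptureVideoPreviewLayer) {
    // Convert view coordinates into the device's 0–1 point-of-interest space.
    let focusPoint = previewLayer.captureDevicePointConverted(fromLayerPoint: viewPoint)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = focusPoint // set point *before* mode
            device.focusMode = .autoFocus            // one-shot refocus at the tap
        }
        device.unlockForConfiguration()
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```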

Why does AVCaptureVideoOrientation landscape modes result in upside down still images?

孤者浪人 submitted on 2019-11-28 07:49:32
I am using AVFoundation classes to implement a custom camera in my app. I am only capturing still images, not video. I have everything working but am stumped by something. I take into account the device orientation when a still image is captured and set the videoOrientation of the video connection appropriately. A code snippet:

// set the videoOrientation based on the device orientation to
// ensure the pic is right side up for all orientations
AVCaptureVideoOrientation videoOrientation;
switch ([UIDevice currentDevice].orientation) {
    case UIDeviceOrientationLandscapeLeft:
        // Not clear why but
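The "upside down" result comes from the two enums being named from opposite points of view: UIDeviceOrientation is named for where the hardware (home button) points, while AVCaptureVideoOrientation describes which way is "up" in the video, so the two landscape cases must cross over. A Swift sketch of the usual mapping:

```swift
import AVFoundation
import UIKit

// Device orientation → video orientation. Note the deliberate swap of
// the landscape cases: the enums describe orientation from opposite
// perspectives, so landscapeLeft maps to landscapeRight and vice versa.
func videoOrientation(for deviceOrientation: UIDeviceOrientation) -> AVCaptureVideoOrientation {
    switch deviceOrientation {
    case .landscapeLeft:       return .landscapeRight
    case .landscapeRight:      return .landscapeLeft
    case .portraitUpsideDown:  return .portraitUpsideDown
    default:                   return .portrait // includes .faceUp/.faceDown/.unknown
    }
}
```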

iOS Capture “screenshot” of camera controller

喜你入骨 submitted on 2019-11-28 06:03:45
Question: In my app I display the camera and I am taking screenshots of certain parts using UIGetScreenImage. (I tried UIGraphicsGetImageFromCurrentImageContext, and it works great for screenshots of almost any part of my app, but for the camera view it just returns a blank white image.) Anyway, I fear Apple will reject my app because of UIGetScreenImage... How can I take a "screenshot" of a 50px by 50px box in the upper left corner of the camera view without using this method? I searched and all I
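One App Store-safe approach is to skip screen capture entirely: receive camera frames via an AVCaptureVideoDataOutput (or a still-image capture), turn a frame into a UIImage, and crop the region you need with Core Graphics. A hedged Swift sketch of just the cropping step (the frame-to-UIImage conversion is assumed to exist elsewhere):

```swift
import UIKit

// Sketch: crop the top-left `side`×`side` point square from a camera frame
// that has already been converted to a UIImage.
func cropTopLeft(of image: UIImage, side: CGFloat = 50) -> UIImage? {
    // CGImage cropping works in pixels, so account for the image scale.
    let scale = image.scale
    let rect = CGRect(x: 0, y: 0, width: side * scale, height: side * scale)
    guard let cg = image.cgImage?.cropping(to: rect) else { return nil }
    return UIImage(cgImage: cg, scale: scale, orientation: image.imageOrientation)
}
```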

Scanning Barcode or QR code in Swift 3.0 using AVFoundation

一个人想着一个人 submitted on 2019-11-28 05:05:54
I am following this tutorial and tried to convert the code from Swift 2.0 to 3.0. But when I launch the application, the app doesn't work! I mean, nothing happens! Here is my code:

ViewController:

class ViewController: UIViewController, BarcodeDelegate {
    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        let barcodeViewController: BarcodeViewController = segue.destination as! BarcodeViewController
        barcodeViewController.delegate = self
    }

    func barcodeReaded(barcode: String) {
        codeTextView.text = barcode
        print(barcode)
    }
}

BarcodeVC:

import AVFoundation

protocol BarcodeDelegate {
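A frequent cause of "nothing happens" after a Swift 2 → 3 conversion is the metadata delegate method: its signature changed, and if the name does not match exactly, AVFoundation simply never calls it, with no error. A hedged sketch of the Swift 3 callback, reusing the question's `BarcodeViewController` and `barcodeReaded(barcode:)` names:

```swift
import AVFoundation

extension BarcodeViewController: AVCaptureMetadataOutputObjectsDelegate {
    // Swift 3 signature — if this doesn't match exactly, the callback is
    // silently skipped. (Swift 4+ renamed it metadataOutput(_:didOutput:from:).)
    func captureOutput(_ captureOutput: AVCaptureOutput,
                       didOutputMetadataObjects metadataObjects: [Any],
                       from connection: AVCaptureConnection) {
        guard let object = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let value = object.stringValue else { return }
        delegate?.barcodeReaded(barcode: value)
    }
}
```

Also make sure `metadataObjectTypes` is set only after the output has been added to the session, or the types are rejected.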

Capture 60fps in iPhone app

元气小坏坏 submitted on 2019-11-28 05:03:52
I am working on a project where we will be using iPhones as cameras for capturing a scene. When recording, we need to record at 60fps, not the natively supported 30fps. So I am working on an app to do this, since the iPhone 4S hardware supports 720p@60fps (if you jailbreak your phone you can achieve this). Does anybody know how to do this in Objective-C on iOS? Today I saw an app (SloPro) that can record 60fps on non-jailbroken phones. Any advice or tips are much appreciated. After some tinkering, this answer has split into two parts: How to capture frames at 60fps The
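On later SDKs this no longer needs a jailbreak: you enumerate the device's formats for one whose frame-rate ranges reach 60 fps, then pin the min/max frame duration to 1/60 s. A hedged sketch in modern Swift (the original question era would use the analogous Objective-C calls):

```swift
import AVFoundation

// Sketch: pick a device format that supports 60 fps and lock the
// frame duration to exactly 1/60 s.
func configureFor60FPS(_ device: AVCaptureDevice) throws {
    for format in device.formats {
        for range in format.videoSupportedFrameRateRanges where range.maxFrameRate >= 60 {
            try device.lockForConfiguration()
            device.activeFormat = format
            device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 60)
            device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 60)
            device.unlockForConfiguration()
            return
        }
    }
    // No 60 fps-capable format found on this device.
}
```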

How do I use the metadataOutputRectOfInterestForRect method and rectOfInterest property to scan a specific area? (QR Code)

风流意气都作罢 submitted on 2019-11-28 04:31:36
I am building a QR code scanner with Swift and everything works in that regard. The issue I have is that I am trying to make only a small area of the entire visible AVCaptureVideoPreviewLayer able to scan QR codes. I have found out that in order to specify what area of the screen will be able to read/capture QR codes, I would have to use a property of AVCaptureMetadataOutput called rectOfInterest . The trouble is, when I assigned that to a CGRect, I couldn't scan anything. After doing more research online I found some suggestions that I would need to use a method called
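`rectOfInterest` is expressed in metadata coordinates (0–1, origin at top-left of the landscape-oriented frame), not in screen points, which is why a raw CGRect scans nothing. The preview layer can do the conversion, but only once the session is running and the connection's format is known. A hedged sketch, assuming `metadataOutput`, `previewLayer`, and a `scanRect` in view coordinates as in the question:

```swift
import AVFoundation
import UIKit

// Sketch: restrict scanning to `scanRect` (view coordinates). The conversion
// must run after the session has a format, so hook the format-change
// notification instead of converting at view-load time.
func restrictScanning(metadataOutput: AVCaptureMetadataOutput,
                      previewLayer: AVCaptureVideoPreviewLayer,
                      scanRect: CGRect) {
    NotificationCenter.default.addObserver(
        forName: .AVCaptureInputPortFormatDescriptionDidChange,
        object: nil,
        queue: .main) { _ in
        // metadataOutputRectConverted(fromLayerRect:) maps view points into
        // the 0–1 metadata coordinate space rectOfInterest expects.
        metadataOutput.rectOfInterest =
            previewLayer.metadataOutputRectConverted(fromLayerRect: scanRect)
    }
}
```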