I have a photo app that uses AVFoundation. I have set up a preview layer using AVCaptureVideoPreviewLayer that takes up the top half of the screen. So when the user takes a photo, how can I crop the captured image so that it matches exactly what is visible in the preview layer?
@Cabus has a solution that works, and you should up-vote his answer. However, here is my own version in Swift:
// The image returned in initialImageData will be larger than what
// is shown in the AVCaptureVideoPreviewLayer, so we need to crop it.
let image: UIImage = UIImage(data: initialImageData)!

let originalSize: CGSize
let visibleLayerFrame = self.previewView!.bounds // The actual visible area in the layer frame

// Calculate the fractional rect of the image that is shown in the preview.
let metaRect: CGRect = (self.videoPreviewLayer?.metadataOutputRectOfInterestForRect(visibleLayerFrame))!

if (image.imageOrientation == UIImageOrientation.Left || image.imageOrientation == UIImageOrientation.Right) {
    // For these images (which are portrait), swap the width and height,
    // because the underlying bitmap is rotated 90 degrees relative to
    // what you see on screen.
    originalSize = CGSize(width: image.size.height, height: image.size.width)
}
else {
    originalSize = image.size
}

// metaRect is fractional (0...1), so multiply by the image size to get pixels.
let cropRect: CGRect = CGRectIntegral(
    CGRect(x: metaRect.origin.x * originalSize.width,
           y: metaRect.origin.y * originalSize.height,
           width: metaRect.size.width * originalSize.width,
           height: metaRect.size.height * originalSize.height))

let finalImage: UIImage =
    UIImage(CGImage: CGImageCreateWithImageInRect(image.CGImage, cropRect)!,
            scale: 1,
            orientation: image.imageOrientation)