Question
With this question I am only asking about the possibilities I have with Xcode and iOS, without external libraries. I am already exploring the possibility of using libtiff in another question.
Problem
I have been sifting through Stack Overflow for weeks and found working solutions for each of my problems on its own. I have 4 things that need to work:
1. I need the RGBA data as it comes from the camera, no compression whatsoever
2. I need as much metadata as possible, especially EXIF
3. I need to save in TIFF format, for compatibility with other software and for losslessness
4. I need protection from casual viewing by saving to a file, not to the photo library
I can have 2 and 4 by using JPEG. I can have 1, 3 and 4 with raw data (i.e. NSData) made from the camera buffer. Can I have all 4 of my prerequisites with Xcode and iOS? I am about to give up and am looking for your input as a last resort.
While exploring this route, I am also stuck on the other avenue I tried, libtiff. I keep trying, though...
Here is a list of the great advice I have tried; my own code is just put together from Stack Overflow sources like these:
- How to write exif metadata to an image (not the camera roll, just a UIImage or JPEG) (makes me wish I could use the JPEG format; it is so effortless when you do what Apple prefers)
- Raw image data from camera like “645 PRO” (this would be the point to use e.g. libtiff)
- Saving CGImageRef to a png file? (works with kUTTypeTIFF, too, but no metadata)
Solution
The complete sequence of actions inside the completion handler of captureStillImageAsynchronouslyFromConnection:
[[self myAVCaptureStillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    // get all the metadata in the image
    CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageSampleBuffer, kCMAttachmentMode_ShouldPropagate);
    // get image reference
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
    // >>>>>>>>>> lock buffer address
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // create suitable color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // create a context suitable for the camera output setting kCVPixelFormatType_32BGRA
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // <<<<<<<<<< unlock buffer address
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // release color space
    CGColorSpaceRelease(colorSpace);
    // create a CGImageRef from the CVImageBufferRef
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    // release context
    CGContextRelease(newContext);
    // create destination and write image with metadata
    // (filePath is an NSString holding the destination path, defined elsewhere)
    CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:filePath isDirectory:NO];
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypeTIFF, 1, NULL);
    CGImageDestinationAddImage(destination, newImage, metadata);
    // finalize and release destination
    CGImageDestinationFinalize(destination);
    CFRelease(destination);
    // release the image and the metadata dictionary
    CGImageRelease(newImage);
    if (metadata) CFRelease(metadata);
}];
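The videoConnection passed in above is not defined in the snippet; a minimal sketch of one common way to obtain it, assuming myAVCaptureStillImageOutput has already been added to the capture session:
// ask the configured still image output for its video connection
AVCaptureConnection *videoConnection = [[self myAVCaptureStillImageOutput] connectionWithMediaType:AVMediaTypeVideo];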
The still image output related camera settings were:
[[self myAVCaptureSession] setSessionPreset:AVCaptureSessionPresetPhoto];
and
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey, nil];
[myAVCaptureStillImageOutput setOutputSettings:outputSettings];
I get a nice, nominally uncompressed image in nominal TIFF format with all metadata. (It is mirrored on other systems, but now that I can write EXIF and other metadata, I am sure I can fine-tune that as well.)
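As for the mirroring, one option (an untested assumption on my part, based on the ImageIO property keys, not something from the answers) would be to override the orientation tag in a mutable copy of the metadata before handing it to the destination; the name mutableMetadata and the value 1 are purely illustrative:
// make a mutable copy of the metadata so the orientation tag can be adjusted
CFMutableDictionaryRef mutableMetadata = CFDictionaryCreateMutableCopy(kCFAllocatorDefault, 0, metadata);
// 1 = normal, 2 = mirrored horizontally; see kCGImagePropertyOrientation in CGImageProperties.h
int orientation = 1;
CFNumberRef orientationNumber = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &orientation);
CFDictionarySetValue(mutableMetadata, kCGImagePropertyOrientation, orientationNumber);
CFRelease(orientationNumber);
// then pass mutableMetadata instead of metadata to CGImageDestinationAddImage
// (and CFRelease(mutableMetadata) after finalizing the destination)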
Thanks again to Wildaker for his help!
Answer 1:
As you've already cracked 1, 3 and 4, it seems the only hurdle left is saving the data and metadata together. Try this (assuming the unprocessed data is in a CMSampleBufferRef called myImageDataSampleBuffer and you've done the heavy lifting of putting the graphical data into a CGImageRef called myImage):
CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
myImageDataSampleBuffer,
kCMAttachmentMode_ShouldPropagate);
NSFileManager* fm = [[NSFileManager alloc] init];
NSURL* pathUrl = [fm URLForDirectory:saveDir
inDomain:NSUserDomainMask
appropriateForURL:nil
create:YES
error:nil];
NSURL* saveUrl = [pathUrl URLByAppendingPathComponent:@"myfilename.tif"];
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)saveUrl,
(CFStringRef)@"public.tiff", 1, NULL);
CGImageDestinationAddImage(destination, myImage, metadata);
CGImageDestinationFinalize(destination);
CFRelease(destination);
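Note that saveDir is not defined in the snippet above; it is an NSSearchPathDirectory value for URLForDirectory:inDomain:appropriateForURL:create:error:. A minimal sketch of one plausible choice, assuming the file should go into the app's Documents directory (which also satisfies requirement 4, keeping it out of the photo library):
// the app's sandboxed Documents directory, not the photo library
NSSearchPathDirectory saveDir = NSDocumentDirectory;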
Answer 2:
This thread was very helpful in resolving a very similar problem, so I thought I'd contribute a Swift 2.0 implementation of the solution in case someone comes looking.
stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (imageDataSampleBuffer, error) -> Void in
    // get image metadata (EXIF, etc.)
    let metaData: CFDictionaryRef? = CMCopyDictionaryOfAttachments( kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate )
    // get reference to image
    guard let imageBuffer = CMSampleBufferGetImageBuffer( imageDataSampleBuffer ) else { return }
    // lock the buffer
    CVPixelBufferLockBaseAddress( imageBuffer, 0 )
    // read image properties
    let baseAddress = CVPixelBufferGetBaseAddress( imageBuffer )
    let bytesPerRow = CVPixelBufferGetBytesPerRow( imageBuffer )
    let width = CVPixelBufferGetWidth( imageBuffer )
    let height = CVPixelBufferGetHeight( imageBuffer )
    // color space
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    // context - matches the camera output setting kCVPixelFormatType_32BGRA
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue).union(.ByteOrder32Little)
    let newContext = CGBitmapContextCreate( baseAddress, width, height, 8, bytesPerRow, colorSpace, bitmapInfo.rawValue )
    // unlock buffer
    CVPixelBufferUnlockBaseAddress( imageBuffer, 0 )
    // create a CGImageRef from the CVImageBufferRef
    guard let newImage = CGBitmapContextCreateImage( newContext ) else {
        return
    }
    // create tmp file and write image with metadata
    let fileName = String(format: "%@_%@", NSProcessInfo.processInfo().globallyUniqueString, "cap.tiff")
    let fileURL = NSURL(fileURLWithPath: NSTemporaryDirectory()).URLByAppendingPathComponent(fileName)
    if let destination = CGImageDestinationCreateWithURL( fileURL, kUTTypeTIFF, 1, nil) {
        CGImageDestinationAddImage( destination, newImage, metaData )
        let wrote = CGImageDestinationFinalize( destination )
        if !wrote || !NSFileManager.defaultManager().fileExistsAtPath(fileURL.path!) {
            // the write failed
            return
        }
    }
})
P.S. For this to work, you have to configure the still image output like this:
stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput?.outputSettings = [ kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA) ]
Source: https://stackoverflow.com/questions/17361100/how-to-save-a-tiff-photo-from-avfoundations-capturestillimageasynchronouslyfromc