I am developing a project where the requirements are:
- The user will open the camera through the application
- Upon capturing an image, some data will be appended to the captured image
Apple has updated their article addressing this issue (Technical Q&A QA1622). If you're using an older version of Xcode, you may still have the article that says, more or less, tough luck, you can't do this without low-level parsing of the image data.
https://developer.apple.com/library/ios/#qa/qa1622/_index.html
I adapted the code there as follows:
    - (void)saveImage:(UIImage *)imageToSave withInfo:(NSDictionary *)info
    {
        // Get the assets library
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

        // Get the image metadata (EXIF & TIFF)
        NSMutableDictionary *imageMetadata = [[info objectForKey:UIImagePickerControllerMediaMetadata] mutableCopy];

        // Add GPS data
        CLLocation *loc = <•••>; // need a location here
        if ( loc ) {
            [imageMetadata setObject:[self gpsDictionaryForLocation:loc] forKey:(NSString *)kCGImagePropertyGPSDictionary];
        }

        ALAssetsLibraryWriteImageCompletionBlock imageWriteCompletionBlock =
        ^(NSURL *newURL, NSError *error) {
            if (error) {
                NSLog( @"Error writing image with metadata to Photo Library: %@", error );
            } else {
                NSLog( @"Wrote image %@ with metadata %@ to Photo Library", newURL, imageMetadata );
            }
        };

        // Save the new image to the Camera Roll
        [library writeImageToSavedPhotosAlbum:[imageToSave CGImage]
                                     metadata:imageMetadata
                              completionBlock:imageWriteCompletionBlock];
        [imageMetadata release];
        [library release];
    }
I call this from imagePickerController:didFinishPickingMediaWithInfo:, which is the delegate method for the image picker.
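For reference, a minimal sketch of that delegate method might look like the following; how you obtain the CLLocation used by saveImage:withInfo: is up to you, and the picker dismissal is just the usual boilerplate:

    // Sketch: hand the captured image and the picker's info dictionary
    // (which carries UIImagePickerControllerMediaMetadata) to saveImage:withInfo:.
    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info
    {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        [self saveImage:image withInfo:info];
        [picker dismissViewControllerAnimated:YES completion:nil];
    }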
I use a helper method (adapted from GusUtils) to build a GPS metadata dictionary from a location:
    - (NSDictionary *)gpsDictionaryForLocation:(CLLocation *)location
    {
        CLLocationDegrees exifLatitude  = location.coordinate.latitude;
        CLLocationDegrees exifLongitude = location.coordinate.longitude;

        NSString *latRef;
        NSString *longRef;
        if (exifLatitude < 0.0) {
            exifLatitude = exifLatitude * -1.0;
            latRef = @"S";
        } else {
            latRef = @"N";
        }
        if (exifLongitude < 0.0) {
            exifLongitude = exifLongitude * -1.0;
            longRef = @"W";
        } else {
            longRef = @"E";
        }

        NSMutableDictionary *locDict = [[NSMutableDictionary alloc] init];
        [locDict setObject:location.timestamp forKey:(NSString *)kCGImagePropertyGPSTimeStamp];
        [locDict setObject:latRef forKey:(NSString *)kCGImagePropertyGPSLatitudeRef];
        // Store coordinates as doubles: CLLocationDegrees is a double, and
        // float precision is not enough for GPS coordinates.
        [locDict setObject:[NSNumber numberWithDouble:exifLatitude] forKey:(NSString *)kCGImagePropertyGPSLatitude];
        [locDict setObject:longRef forKey:(NSString *)kCGImagePropertyGPSLongitudeRef];
        [locDict setObject:[NSNumber numberWithDouble:exifLongitude] forKey:(NSString *)kCGImagePropertyGPSLongitude];
        [locDict setObject:[NSNumber numberWithFloat:location.horizontalAccuracy] forKey:(NSString *)kCGImagePropertyGPSDOP];
        [locDict setObject:[NSNumber numberWithFloat:location.altitude] forKey:(NSString *)kCGImagePropertyGPSAltitude];

        return [locDict autorelease];
    }
So far this is working well for me on iOS4 and iOS5 devices.
Update: it also works on iOS 6 and iOS 7 devices. I built a simple project using this code:
https://github.com/5teev/MetaPhotoSave
The problem we are trying to solve is: the user has just taken a picture with the UIImagePickerController camera. What we get is a UIImage. How do we fold metadata into that UIImage as we save it into the camera roll (photo library), now that we don't have the AssetsLibrary framework?
The answer (as far as I can make out) is: use the ImageIO framework. Extract the JPEG data from the UIImage, use that data as a CGImageSource, write the source together with the metadata dictionary into a CGImageDestination, and save the destination's data into the camera roll as a PHAsset.
In this example, im is the UIImage and meta is the metadata dictionary:
    import UIKit
    import ImageIO
    import Photos

    // Re-encode the UIImage as JPEG, copy it through ImageIO so the metadata
    // dictionary can be attached, then save the resulting data as a new PHAsset.
    let jpeg = UIImageJPEGRepresentation(im, 1)!
    let src = CGImageSourceCreateWithData(jpeg as CFData, nil)!
    let data = NSMutableData()
    let uti = CGImageSourceGetType(src)!
    let dest = CGImageDestinationCreateWithData(data as CFMutableData, uti, 1, nil)!
    CGImageDestinationAddImageFromSource(dest, src, 0, meta)
    CGImageDestinationFinalize(dest)
    let lib = PHPhotoLibrary.shared()
    lib.performChanges({
        let req = PHAssetCreationRequest.forAsset()
        req.addResource(with: .photo, data: data as Data, options: nil)
    })
A good way to test this (and a common use case) is to receive the photo metadata from the UIImagePickerController delegate's info dictionary through the UIImagePickerControllerMediaMetadata key and fold it into the PHAsset as we save it into the photo library.
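A sketch of that delegate wiring, using the Swift 4 era key constants, might look like this; saveImage(_:meta:) is a hypothetical wrapper around the ImageIO / PHAsset snippet above:

    import UIKit

    // Sketch of the UIImagePickerControllerDelegate callback: pull out the image
    // and its metadata, then hand both to a hypothetical saveImage(_:meta:)
    // wrapper around the ImageIO / PHAsset code shown above.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [String : Any]) {
        defer { picker.dismiss(animated: true) }
        guard let im = info[UIImagePickerControllerOriginalImage] as? UIImage,
            let meta = info[UIImagePickerControllerMediaMetadata] as? NSDictionary
            else { return }
        saveImage(im, meta: meta)
    }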