Write UIImage along with metadata (EXIF, GPS, TIFF) in iPhone's Photo library


I am developing a project where the requirements are:

- The user will open the camera through the application
- Upon capturing an image, some data will be appended to the captured image…

8 Answers
  • 2020-11-29 05:17

    A piece of this involves generating the GPS metadata. Here's a category on CLLocation to do just that:

    https://gist.github.com/phildow/6043486
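
    In case the gist link goes away, here is a rough Swift sketch of the kind of helper it provides (the extension name matches the gpsMetadata method the Swift answer below refers to; the exact set of GPS fields is illustrative, so check the gist for the complete list):

    import CoreLocation
    import ImageIO

    extension CLLocation {
        // Illustrative sketch: build a kCGImagePropertyGPSDictionary-style dictionary from a CLLocation
        func gpsMetadata() -> [String: Any] {
            let formatter = DateFormatter()
            formatter.timeZone = TimeZone(abbreviation: "UTC")
            formatter.dateFormat = "HH:mm:ss.SS"
            return [
                kCGImagePropertyGPSLatitude as String: abs(coordinate.latitude),
                kCGImagePropertyGPSLatitudeRef as String: coordinate.latitude >= 0 ? "N" : "S",
                kCGImagePropertyGPSLongitude as String: abs(coordinate.longitude),
                kCGImagePropertyGPSLongitudeRef as String: coordinate.longitude >= 0 ? "E" : "W",
                kCGImagePropertyGPSAltitude as String: altitude,
                kCGImagePropertyGPSTimeStamp as String: formatter.string(from: timestamp)
            ]
        }
    }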

  • 2020-11-29 05:17

    Getting meta data from cam captured image within an application:

    UIImage *pTakenImage = [info objectForKey:UIImagePickerControllerOriginalImage];

    NSMutableDictionary *imageMetadata = [[NSMutableDictionary alloc] initWithDictionary:[info objectForKey:UIImagePickerControllerMediaMetadata]];


    Now, to save the image to the library with the extracted metadata:

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:[sourceImage CGImage] metadata:imageMetadata completionBlock:nil];
    [library release]; // only needed under manual reference counting; omit under ARC


    Or, if you want to save to a local directory:

    // destination is a CGImageDestinationRef and source a CGImageSourceRef (see the Swift sketch below)
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)imageMetadata);

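    For completeness, here is a hedged Swift sketch of that local-directory save, filling in the pieces the one-liner above assumes (jpegData, imageMetadata and destinationURL are placeholders for your own values):

    import ImageIO
    import MobileCoreServices

    // jpegData: Data from the camera, imageMetadata: [String: Any], destinationURL: target file URL (placeholders)
    if let source = CGImageSourceCreateWithData(jpegData as CFData, nil),
       let destination = CGImageDestinationCreateWithURL(destinationURL as CFURL, kUTTypeJPEG, 1, nil) {
        // Copy the image out of the source, merging in the metadata dictionary
        CGImageDestinationAddImageFromSource(destination, source, 0, imageMetadata as CFDictionary)
        if !CGImageDestinationFinalize(destination) {
            // Handle the write failure here
        }
    }
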
  • 2020-11-29 05:19

    For anyone who comes here trying to take a photo with the camera in your app and saving the image file to the camera roll with GPS metadata, I have a Swift solution that uses the Photos API since ALAssetsLibrary is deprecated as of iOS 9.0.

    As rickster mentioned in this answer, the Photos API does not embed location data directly into a JPG image file even if you set the .location property of the new asset.

    Given a CMSampleBuffer named buffer, some CLLocation named location, and following Morty’s suggestion to use CMSetAttachments in order to avoid duplicating the image data, we can do the following. The gpsMetadata method extending CLLocation can be found here.

    if let location = location {
        // Get the existing metadata dictionary (if there is one)
        var metaDict = CMCopyDictionaryOfAttachments(nil, buffer, kCMAttachmentMode_ShouldPropagate) as? Dictionary<String, Any> ?? [:]
    
        // Append the GPS metadata to the existing metadata
        metaDict[kCGImagePropertyGPSDictionary as String] = location.gpsMetadata()
    
        // Save the new metadata back to the buffer without duplicating any data
        CMSetAttachments(buffer, metaDict as CFDictionary, kCMAttachmentMode_ShouldPropagate)
    }
    
    // Get JPG image Data from the buffer
    guard let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer) else {
        // There was a problem; handle it here, then exit the enclosing scope
        return
    }
    
    // Now save this image to the Camera Roll (will save with GPS metadata embedded in the file)
    self.savePhoto(withData: imageData, completion: completion)
    

    The savePhoto method is below. Note that the handy addResource(with:data:options:) method is available only in iOS 9 and later. If you are supporting an earlier iOS version and want to use the Photos API, then you must write a temporary file and create the asset from that file’s URL (PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL:)) if you want the GPS metadata properly embedded. Setting only PHAsset’s .location will NOT embed your new metadata into the actual file itself.

    func savePhoto(withData data: Data, completion: (() -> Void)? = nil) {
        // Note that using the Photos API .location property on a request does NOT embed GPS metadata into the image file itself
        PHPhotoLibrary.shared().performChanges({
          if #available(iOS 9.0, *) {
            // For iOS 9+ we can skip the temporary file step and write the image data from the buffer directly to an asset
            let request = PHAssetCreationRequest.forAsset()
            request.addResource(with: PHAssetResourceType.photo, data: data, options: nil)
            request.creationDate = Date()
          } else {
            // Fallback on earlier versions; write a temporary file and then add this file to the Camera Roll using the Photos API
            let tmpURL = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true).appendingPathComponent("tempPhoto").appendingPathExtension("jpg")
            do {
              try data.write(to: tmpURL)
    
              let request = PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: tmpURL)
              request?.creationDate = Date()
            } catch {
              // Error writing the data; photo is not appended to the camera roll
            }
          }
        }, completionHandler: { _ in
          DispatchQueue.main.async {
            completion?()
          }
        })
    }
    

    Aside: If you just want to save the image with GPS metadata to your temporary files or documents directory (as opposed to the camera roll/photo library), you can skip the Photos API and write imageData directly to a URL.

    // Write photo to temporary files with the GPS metadata embedded in the file
    let tmpURL = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true).appendingPathComponent("tempPhoto").appendingPathExtension("jpg")
    do {
        try imageData.write(to: tmpURL)
    
        // Do more work here...
    } catch {
        // Error writing the data; handle it here
    }
    
  • 2020-11-29 05:25

    There are many frameworks that deal with images and metadata.

    The Assets framework is deprecated and has been replaced by the Photos framework. If you implement AVCapturePhotoCaptureDelegate to capture photos, you can do it like this:

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        // gpsMetadata is a GPS dictionary built elsewhere (for example from a CLLocation)
        var metadata = photo.metadata
        metadata[kCGImagePropertyGPSDictionary as String] = gpsMetadata
        // Re-encode the photo data with the replacement metadata embedded
        photoData = photo.fileDataRepresentation(withReplacementMetadata: metadata,
          replacementEmbeddedThumbnailPhotoFormat: photo.embeddedThumbnailPhotoFormat,
          replacementEmbeddedThumbnailPixelBuffer: nil,
          replacementDepthData: photo.depthData)
        ...
    }
    

    The metadata is a dictionary of dictionaries, and you have to refer to CGImageProperties.
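
    For example, the EXIF and GPS entries live in their own sub-dictionaries under the CGImageProperties keys; a small illustrative Swift snippet (metadata here is the dictionary from photo.metadata above):

    if let exif = metadata[kCGImagePropertyExifDictionary as String] as? [String: Any] {
        print(exif[kCGImagePropertyExifDateTimeOriginal as String] ?? "no capture date")
    }
    if let gps = metadata[kCGImagePropertyGPSDictionary as String] as? [String: Any] {
        print(gps[kCGImagePropertyGPSLatitude as String] ?? "no latitude")
    }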

    I wrote about this topic here.

  • 2020-11-29 05:29

    The function UIImageWriteToSavedPhotosAlbum only writes the image data.

    You need to read up on ALAssetsLibrary (note that it is deprecated as of iOS 9 in favor of the Photos framework).

    The method you ultimately want to call is:

     // imageRef, metadata and completionBlock stand in for your own values
     ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
     [library writeImageToSavedPhotosAlbum:imageRef metadata:metadata completionBlock:completionBlock];

  • 2020-11-29 05:29

    Here is a slight variation of @matt's answer.

    The following code uses only one CGImageDestination and, more interestingly, allows saving in HEIC format on iOS 11+.

    Notice that the compression quality is added to the metadata before adding the image; 0.8 is roughly the compression quality of a native camera save.

    //img is the UIImage and metadata is the metadata dictionary received from the picker
    NSMutableDictionary *meta_plus = metadata.mutableCopy;
    //with CGImageDestination, one can set the compression quality in the metadata
    meta_plus[(NSString *)kCGImageDestinationLossyCompressionQuality] = @(0.8);
    NSMutableData *img_data = [NSMutableData new];
    NSString *type;
    if (@available(iOS 11.0, *)) type = AVFileTypeHEIC;
    else type = @"public.jpeg";
    CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)img_data, (__bridge CFStringRef)type, 1, nil);
    CGImageDestinationAddImage(dest, img.CGImage, (__bridge CFDictionaryRef)meta_plus);
    CGImageDestinationFinalize(dest);
    CFRelease(dest); //image is now in img_data
    //go for the PHPhotoLibrary change request (a sketch follows below)
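
    The change request that last comment refers to could look roughly like this (a Swift sketch; imgData stands for the img_data built above, and photo-library authorization is assumed to have been requested already):

    import Photos

    // imgData: the HEIC/JPEG data produced by the CGImageDestination above (placeholder name)
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, data: imgData, options: nil)
    }, completionHandler: { success, error in
        // Handle success or error here
    })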
    