Question
I am uploading photos to a server with an iOS app. It is important that the photos are uploaded as JPEGs with no loss in quality. My current problem is that the photos upload with no loss of quality but have a larger than expected file size. For example: I uploaded a file through the app and the file size was 4.7 MB. When I emailed the same photo to myself and selected the "Actual Photo" option for the email, the size of the photo was only 1.7 MB. A side by side comparison revealed no difference in quality.
Here is how I am uploading the files.
ALAssetsLibrary *library = [ALAssetsLibrary new];

// getImageAtURL:withCompletionBlock: is a custom helper that loads the asset as a UIImage
[library getImageAtURL:orderImage.imageUrl withCompletionBlock:^(UIImage *image) {
    // re-encode the image as JPEG at maximum quality
    NSData *fileData = UIImageJPEGRepresentation(image, 1.0);

    NSURLRequest *request = [self multipartFormRequestWithMethod:@"POST" path:path parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData)
    {
        [formData appendPartWithFileData:fileData name:@"uploadedfile" fileName:fileName mimeType:mimeType];
        [formData appendPartWithFormData:[extraInfo dataUsingEncoding:NSISOLatin2StringEncoding] name:@"extraInfo"];
    }];
}];
Answer 1:
The problem is UIImageJPEGRepresentation. It does not retrieve the original JPEG, but rather creates a new JPEG. And when you use a compressionQuality of 1 (presumably to avoid further image quality loss), it re-encodes with the least possible compression (generally resulting in a file larger than the original).
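You can see this for yourself with a minimal sketch (assuming you already have an ALAsset in hand; the asset variable here is a placeholder, not something from the question) that compares the size of the original representation against the re-encoded data:

// hypothetical illustration: original bytes vs. re-encoded bytes
ALAssetRepresentation *rep = [asset defaultRepresentation]; // `asset` assumed to be in scope
UIImage *image = [UIImage imageWithCGImage:[rep fullResolutionImage]];
NSData *reencoded = UIImageJPEGRepresentation(image, 1.0);
NSLog(@"original: %lld bytes, re-encoded: %lu bytes", [rep size], (unsigned long)reencoded.length);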
I would advise using getBytes to retrieve the original asset, rather than round-tripping it through a UIImage and getting the data via UIImageJPEGRepresentation:
#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetsLibraryURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];

    // I generally would write directly to a `NSOutputStream`, but if you want it in a
    // NSData, it would be something like:

    NSMutableData *data = [NSMutableData data];

    // now loop, reading data into buffer and writing that to our data stream

    NSError *error;
    long long bufferOffset = 0ll;
    NSInteger bufferSize = 10000;
    long long bytesRemaining = [representation size];
    uint8_t buffer[bufferSize];

    while (bytesRemaining > 0) {
        NSUInteger bytesRead = [representation getBytes:buffer fromOffset:bufferOffset length:bufferSize error:&error];
        if (bytesRead == 0) {
            NSLog(@"error reading asset representation: %@", error);
            return;
        }
        bytesRemaining -= bytesRead;
        bufferOffset += bytesRead;
        [data appendBytes:buffer length:bytesRead];
    }

    // ok, successfully read original asset;
    // do whatever you want with it here

} failureBlock:^(NSError *error) {
    NSLog(@"error=%@", error);
}];
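To tie this back to the upload code in the question, a minimal sketch (assuming the same AFNetworking multipartFormRequestWithMethod: helper and the same path, fileName, and extraInfo variables from the question) would append the original bytes instead of the re-encoded ones:

// inside the resultBlock above, once `data` holds the original asset bytes
// (the surrounding names come from the question and are assumed to be in scope):
NSURLRequest *request = [self multipartFormRequestWithMethod:@"POST" path:path parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData)
{
    // original JPEG bytes, byte-for-byte; no re-encoding, so no size inflation
    [formData appendPartWithFileData:data name:@"uploadedfile" fileName:fileName mimeType:@"image/jpeg"];
    [formData appendPartWithFormData:[extraInfo dataUsingEncoding:NSISOLatin2StringEncoding] name:@"extraInfo"];
}];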
--
If you're using the Photos framework introduced in iOS 8, you can use PHImageManager to get the image data:
#import <Photos/Photos.h>

PHFetchResult *result = [PHAsset fetchAssetsWithALAssetURLs:@[assetsLibraryURL] options:nil];
PHAsset *asset = [result firstObject];

if (asset) {
    PHImageManager *manager = [PHImageManager defaultManager];
    [manager requestImageDataForAsset:asset options:nil resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        // use `imageData` here
    }];
}
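One caveat: if the original asset is stored in iCloud and not available locally, requestImageDataForAsset: can hand you nil data by default. A hedged sketch of the fix, using the Photos framework's networkAccessAllowed option (whether you need this depends on the user's library setup):

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.networkAccessAllowed = YES; // allow downloading the original from iCloud if needed

[manager requestImageDataForAsset:asset options:options resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
    // `imageData` contains the original, unmodified bytes of the asset
}];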
Source: https://stackoverflow.com/questions/27709036/uploading-image-with-ios-app-to-server-file-size-is-too-large