How to downscale a UIImage in iOS by the data size

时光取名叫无心 · 2020-12-04 16:30

I am looking to downscale a UIImage in iOS.

I have seen other questions and their approaches to downscaling an image by its dimensions, but I need to limit the resulting data size instead.

5 Answers
  •  萌比男神i
    2020-12-04 16:37

    In your revised question, you clarified that your goal was to stay within file size limitations while uploading images. In that case, playing around with JPEG compression options is fine, as rmaddy suggested.

    The interesting wrinkle is that you have two variables to play with, JPEG compression and image dimensions (there are others, too, but I'll keep it simple). How do you want to prioritize one over the other? For example, I don't think it makes sense to keep a full-resolution, absurdly compressed image (e.g. a 0.1 quality factor). Nor does it make sense to keep a tiny-resolution, uncompressed image. Personally, I'd iteratively adjust the quality as rmaddy suggests, but set some reasonable floor (e.g. a JPEG quality of not less than, say, 0.70). At that point, I'd consider changing the image dimensions (which changes the file size pretty quickly, too), altering them until the resulting NSData was an appropriate size. See the sketch below.
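
    Here is a minimal Swift sketch of that two-stage approach, assuming a hypothetical maxBytes budget; the 0.05 quality step, the 10% shrink factor, and the helper names are my own illustrative choices, not anything prescribed here:

    ```swift
    import UIKit

    /// Reduce JPEG quality down to a floor, then shrink dimensions
    /// until the encoded data fits within `maxBytes`.
    /// All names and step sizes here are illustrative assumptions.
    func jpegData(for image: UIImage, maxBytes: Int) -> Data? {
        let qualityFloor: CGFloat = 0.70   // don't sacrifice quality below this
        var quality: CGFloat = 1.0
        var data = image.jpegData(compressionQuality: quality)

        // Stage 1: walk the compression quality down toward the floor.
        while let d = data, d.count > maxBytes, quality > qualityFloor {
            quality -= 0.05
            data = image.jpegData(compressionQuality: quality)
        }

        // Stage 2: if still too big, shrink the dimensions 10% at a time.
        var current = image
        while let d = data, d.count > maxBytes {
            let smaller = CGSize(width: current.size.width * 0.9,
                                 height: current.size.height * 0.9)
            guard smaller.width >= 1, smaller.height >= 1 else { break }
            current = resized(current, to: smaller)
            data = current.jpegData(compressionQuality: quality)
        }
        return data
    }

    /// Redraw an image at a new point size (hypothetical helper).
    func resized(_ image: UIImage, to size: CGSize) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: size)
        return renderer.image { _ in
            image.draw(in: CGRect(origin: .zero, size: size))
        }
    }
    ```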

    Anyway, in my original answer, I focused on the memory consumption within the app (as opposed to file size). For posterity's sake, see that answer below:


    If you are trying to control how much memory is used when you load the images into UIImage objects for use in UIKit, then playing around with JPEG compression won't help you much, because the internal representation of the images, once loaded into UIKit objects, is uncompressed. In that scenario, JPEG compression options don't accomplish much (other than sacrificing image quality).

    To illustrate the idea, I have an image that is 1920 × 1080. I have it in PNG format (a 629 KB file), a compressed JPEG format (217 KB), and a minimally compressed JPEG format (1.1 MB). But when I load those three different images into UIImageView objects (even with a very small frame), the "Allocations" tool in Instruments shows that each takes up 7.91 MB:

    [screenshot: Instruments "Allocations" tool]

    This is because when you load the image into an image view, the internal, uncompressed representation of these three images is four bytes per pixel (one byte for red, one for green, one for blue, and one for alpha). Thus my 1920 × 1080 images each take up 1920 × 1080 × 4 = 8,294,400 bytes ≈ 7.91 MB.

    So, if you don't want them to take up more than 500 KB in memory when loaded into image view objects, that means you want to resize them such that the product of the width and the height is 128,000 pixels or less (i.e. if square, less than 358 × 358 pixels). A sketch of that calculation follows.
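
    To make that arithmetic concrete, here is a small Swift sketch; it assumes the four-bytes-per-pixel figure above, and the function and parameter names are hypothetical:

    ```swift
    import UIKit

    /// Largest point size whose uncompressed bitmap (at 4 bytes per
    /// pixel) fits within `maxBytes`. Names are illustrative assumptions.
    func size(of image: UIImage, constrainedToMemory maxBytes: Int) -> CGSize {
        let pixelWidth = image.size.width * image.scale
        let pixelHeight = image.size.height * image.scale
        let currentBytes = pixelWidth * pixelHeight * 4
        guard currentBytes > CGFloat(maxBytes) else { return image.size }

        // Scale both dimensions by the square root of the byte ratio,
        // so width × height × 4 lands at (or just under) the budget.
        let factor = sqrt(CGFloat(maxBytes) / currentBytes)
        return CGSize(width: floor(image.size.width * factor),
                      height: floor(image.size.height * factor))
    }
    ```

    For example, a 1920 × 1080 image with a 512,000-byte (500 KB) budget comes out to roughly 477 × 268, and 477 × 268 × 4 = 511,344 bytes, just under the limit.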

    But if your concern is network bandwidth as you upload images, or persistent storage capacity, then go ahead and play around with JPEG compression values as suggested in rmaddy's excellent answer. If, however, you're trying to address memory consumption while the images are loaded into UIKit objects, then don't focus on compression; focus on resizing the image.
