Hitting hard memory limit when loading large image, possible to downsample while loading?


Question


I need to generate small thumbnails from potentially gigantic images (10,000 x 10,000). I'm using ImageMagick and PHP running in Google App Engine. My GAE has a hard memory limit of 512 MB.

I've been trying to read the documentation for the Imagick PHP extension, but the docs are pretty thin. I found an SO post with an answer from the maintainer and tried to adapt their code. I tried to set the area resource limit so that images larger than 2000x2000 pixels would be swapped out to disk instead of held in memory, but either I misunderstand the code or I'm doing something wrong: my GAE instance still crashes, saying that I've exceeded the memory limit.

try {
    // Intent: page the pixel cache for images larger than 2000x2000 pixels
    // out to disk instead of holding it in memory, allowing up to 2 GB of disk.
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_AREA, 2000 * 2000);
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_DISK, 1024 * 1024 * 1024 * 2);

    $im = new \Imagick($path);
    
    // ...

} catch (\Exception $e) {
    error_log("Exception: " . $e->getMessage() . "\n");
}

Exceeded hard memory limit of 512 MB with 514 MB...

On Android devices, when loading a Bitmap from disk, there is an option to downsample it as it is being loaded into memory. That allows me to load smaller versions of the image into memory and avoid the problem of gigantic files breaking everything.

Is there something similar in ImageMagick? I saw Imagick::setSamplingFactors(), but the documentation doesn't explain the parameters at all, so I'm not sure how to use it.

How can I generate a tiny thumbnail from a giant image without hitting my hard memory limit on Google App Engine?


Answer 1:


If you are shrinking JPEG images, then imagemagick supports shrink-on-load. For example, here's a 10k x 10k pixel JPEG image being sized down to 200x200.

$ /usr/bin/time -f %M:%e \
    convert wtc.jpg -resize 200x200 x.jpg
713340:2.98

That's 720MB of peak memory use and almost 3s of CPU time. Now try this:

$ /usr/bin/time -f %M:%e \
    convert -define jpeg:size=400x400 wtc.jpg -resize 200x200 x.jpg
35952:0.32

Down to 35MB of memory and 300ms of CPU.

The -define jpeg:size=400x400 hints to the JPEG loader that you want an image of at least 400x400 pixels, so (in this case) during load, it'll fetch at 1/8th size. You need the load hint size to be at least 2x larger than your final output size to avoid aliasing.

You can set this option from Imagick with setOption().
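For example, here's a minimal sketch (the variable names $path and $thumbPath are placeholders, not from the original answer). Note that the option has to be set before the image is read:

$im = new \Imagick();
// Hint the JPEG decoder to decode at roughly this size (shrink-on-load).
$im->setOption('jpeg:size', '400x400');
$im->readImage($path);
// Produce the final 200x200 thumbnail from the already-shrunk image.
$im->thumbnailImage(200, 200, true);
$im->writeImage($thumbPath);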

Unfortunately, many loaders do not support shrink-on-load. PNG is especially bad:

$ /usr/bin/time -f %M:%e \
    convert wtc.png -resize 200x200 x.jpg
828376:5.62

830MB and 5.6s.

You could consider other resize programs. vipsthumbnail is fast and low-memory for almost all file formats, for example:

$ /usr/bin/time -f %M:%e \
    vipsthumbnail wtc.png --size 200x200 -o x.jpg
58780:2.29

60MB and 2.3s for the same PNG file. Quality is the same as imagemagick.

It has a PHP binding too, so you can write, e.g.:

use Jcupitt\Vips;

$image = Vips\Image::thumbnail('somefile.jpg', 200);
$image->writeToFile('tiny.jpg');
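As an aside (and as an assumption about the setup, since the answer doesn't spell it out): the binding above is the jcupitt/vips Composer package, and Image::thumbnail uses the same shrink-on-load logic as the vipsthumbnail command-line tool, so memory use should stay similarly low.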


Source: https://stackoverflow.com/questions/62885189/hitting-hard-memory-limit-when-loading-large-image-possible-to-downsample-while
