Scaling images stored at S3

Submitted by 这一生的挚爱 on 2019-12-18 11:53:49

Question


I'm in a situation where I need to push image storage for a number of websites out to a service that can scale indefinitely (S3, CloudFiles, etc.). Up until this point we've been able to let our users generate custom thumbnail sizes on the fly using the Python Imaging Library (PIL) with some help from sorl-thumbnail in Django.

By moving our images to something like S3, the ability to quickly create thumbnails on the fly is lost. We can either:

  1. Do it slowly by downloading the source from S3 and creating the thumbnail locally
    con: it is slow and bandwidth intensive
  2. Do it upfront by creating a pre-determined set of thumbnail sizes (à la Flickr) and pushing them all to S3
    con: it limits the sizes that can be generated and stores lots of files that will never be used
  3. Let the browser resize using the height/width attributes on the img tag.
    con: extra bandwidth used by downloading larger than necessary files

At this point #3 looks to be a simple solution to the problem with few drawbacks. Some quick tests and data from this website show that the quality isn't as bad as expected (we could ensure the aspect ratio is maintained).
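Maintaining the aspect ratio under option #3 is just arithmetic: scale both dimensions by the same factor so the larger side fits the target. A minimal sketch (the function name `fit_within` is ours, not from any library):

```python
def fit_within(width, height, max_side):
    """Scale (width, height) so neither side exceeds max_side,
    preserving the aspect ratio. Never upscales."""
    scale = min(max_side / width, max_side / height, 1.0)
    return max(1, round(width * scale)), max(1, round(height * scale))

# e.g. a 1000x500 photo constrained to 200px becomes 200x100
```

The pair returned here is what you would put in the `width`/`height` attributes of the `img` tag, so the browser downscales without distorting the image.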

Any suggestions on other options or drawbacks we might not be taking into consideration?

note: The images are digital photos and are only used for display on the web. Sizes would range from 50 to 1000 pixels in height/width.


Answer 1:


I would recommend using EC2 to scale the images on demand. Since bandwidth between EC2 and S3 is free, and the transfer should be fast, I think that eliminates all the problems with solution #1.
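A sketch of that on-demand approach, assuming boto3 and Pillow are available on the EC2 instance. The `thumbs/<size>/` key convention and all function names here are our own illustration, not from the question:

```python
import io

def thumb_key(key, max_side):
    # Hypothetical naming convention: cache thumbnails under thumbs/<size>/.
    return "thumbs/%d/%s" % (max_side, key)

def make_thumbnail(jpeg_bytes, max_side):
    # Assumes Pillow is installed; imported lazily so this module
    # still loads in environments without PIL.
    from PIL import Image
    img = Image.open(io.BytesIO(jpeg_bytes))
    img.thumbnail((max_side, max_side))  # preserves aspect ratio, never upscales
    out = io.BytesIO()
    img.save(out, format="JPEG", quality=85)
    return out.getvalue()

def thumbnail_on_demand(s3, bucket, key, max_side):
    # Assumes an s3 client from boto3.client("s3"). EC2<->S3 transfer is
    # free within a region, so pulling the full-size original is cheap.
    original = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    thumb = make_thumbnail(original, max_side)
    s3.put_object(Bucket=bucket, Key=thumb_key(key, max_side),
                  Body=thumb, ContentType="image/jpeg")
    return thumb_key(key, max_side)
```

On a miss, the web tier would call `thumbnail_on_demand` once and thereafter serve the cached object at `thumbs/<size>/...` directly from S3, so each size is generated lazily but only paid for once.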



Source: https://stackoverflow.com/questions/1563347/scaling-images-stored-at-s3
