GZIP Compression on static Amazon S3 files

Front-end · Open · 2 answers · 483 views
佛祖请我去吃肉 2020-12-29 07:21

I would like to implement GZIP compression on my site. I've implemented it on IIS, and the HTML page is compressed successfully. How can I do the same for static files hosted on Amazon S3?

2 Answers

一向 · 2020-12-29 07:37

    If you use CloudFront in front of your S3 bucket, there is no need to manually compress HTML resources (CloudFront will compress them on the fly). Note that CloudFront only compresses with gzip (not deflate or Brotli) and only CSS / JS / HTML files (based on the Content-Type header). See https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html#compressed-content-cloudfront-file-types . To make it work, you have to forward some HTTP headers from CloudFront to S3 (see the documentation).
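For reference, the on-the-fly compression toggle is the `Compress` flag on the distribution's cache behavior. A minimal CloudFormation sketch, assuming a placeholder bucket name and a legacy-style cache behavior:

```yaml
# Hypothetical fragment: enable on-the-fly gzip compression
# for a CloudFront distribution in front of an S3 origin.
Resources:
  SiteDistribution:
    Type: AWS::CloudFront::Distribution
    Properties:
      DistributionConfig:
        Enabled: true
        Origins:
          - Id: s3-origin
            DomainName: my-bucket.s3.amazonaws.com   # placeholder bucket
            S3OriginConfig: {}
        DefaultCacheBehavior:
          TargetOriginId: s3-origin
          ViewerProtocolPolicy: redirect-to-https
          Compress: true   # the on-the-fly compression switch
          ForwardedValues:
            QueryString: false
```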

    If your S3 bucket has resources not supported by CloudFront (a generic "binary/octet-stream" MIME type, like an "hdr" texture or an "nds" ROM), you need to compress them yourself before uploading them to S3, then set the "Content-Encoding" HTTP metadata on the resource. Note that only browsers that support gzip encoding will be able to download and decompress the file.
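A minimal sketch of that manual route (bucket and key names are placeholders, so the actual boto3 upload call is left commented out):

```python
import gzip

def gzip_for_s3(data: bytes, content_type: str):
    """Compress a payload and return the body plus the S3 object
    metadata that tells browsers to decompress it on download."""
    body = gzip.compress(data)
    metadata = {
        "ContentType": content_type,
        "ContentEncoding": "gzip",  # the crucial header
    }
    return body, metadata

body, meta = gzip_for_s3(b'{"hello": "world"}' * 100, "application/json")

# Upload with boto3 (placeholder bucket/key, sketch only):
# import boto3
# boto3.client("s3").put_object(Bucket="my-bucket", Key="data.json",
#                               Body=body, **meta)
```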

    If you don't want to compress the files one by one by hand, you can use a Lambda function that is:

    • triggered on each PUT of an object (a file) into the bucket
    • if the file is not already compressed and compression is useful, replaces the original uploaded file with the compressed version
    • sets the Content-Encoding HTTP header to gzip

    I wrote a gist for this; it may inspire you to create your own process. See https://gist.github.com/psa-jforestier/1c74330df8e0d1fd6028e75e210e5042
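The steps above can be sketched roughly like this (an illustration, not the gist itself; the event shape follows S3 PUT notifications, boto3 is imported lazily inside the handler, and the Content-Encoding check doubles as a guard against the function re-triggering itself):

```python
import gzip

SKIP_SUFFIXES = (".gz", ".zip", ".png", ".jpg")  # already-compressed formats

def compress_if_useful(data: bytes, min_ratio: float = 0.9):
    """Return the gzipped payload only if it actually saves space."""
    packed = gzip.compress(data)
    return packed if len(packed) < len(data) * min_ratio else None

def handler(event, context):
    import boto3  # imported lazily so the helper above runs anywhere
    s3 = boto3.client("s3")
    for record in event["Records"]:  # one record per PUT
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        if key.endswith(SKIP_SUFFIXES):
            continue  # skip formats that are already compressed
        obj = s3.get_object(Bucket=bucket, Key=key)
        if obj.get("ContentEncoding") == "gzip":
            continue  # already processed; also prevents a trigger loop
        packed = compress_if_useful(obj["Body"].read())
        if packed is None:
            continue  # compression would not help this file
        # Replace the original object and set the encoding header.
        s3.put_object(Bucket=bucket, Key=key, Body=packed,
                      ContentType=obj["ContentType"],
                      ContentEncoding="gzip")
```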

    And don't forget to invalidate (purge) CloudFront to apply your changes.
