Comparison of lz4 vs lz4_hc vs blosc vs snappy vs fastlz

粉色の甜心 · 2021-02-01 21:13

I have a large file (about 500 MB) that I need to compress within a minute, with the best possible compression ratio. I have found these algorithms to be suitable for my use case.
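For what it's worth, a minimal timing sketch like the one below makes the one-minute budget easy to check on the real data. It is a rough sketch in Python, assuming the `lz4` and `python-snappy` packages are installed; the input file name is hypothetical, and blosc or fastlz bindings could be benchmarked the same way:

```python
import time

import lz4.frame
import snappy


def bench(name, compress, data):
    """Time one codec on the payload; report throughput and ratio."""
    t0 = time.perf_counter()
    out = compress(data)
    dt = time.perf_counter() - t0
    print(f"{name:8s} {len(data) / 1e6 / dt:8.1f} MB/s  "
          f"ratio {len(data) / len(out):5.2f}")


with open("big_file.bin", "rb") as f:  # hypothetical 500 MB input
    data = f.read()

bench("lz4", lz4.frame.compress, data)
# COMPRESSIONLEVEL_MINHC is the lowest level that engages lz4's HC mode
bench("lz4_hc",
      lambda d: lz4.frame.compress(
          d, compression_level=lz4.frame.COMPRESSIONLEVEL_MINHC),
      data)
bench("snappy", snappy.compress, data)
```

Any codec that shows well above ~8 MB/s on your hardware fits the one-minute budget for 500 MB, so the ratio column becomes the deciding factor.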

4 Answers
  •  自闭症患者
    2021-02-01 21:59

    Like most questions, the answer usually ends up being: It depends :)

    The other answers gave you good pointers, but another thing to take into account is RAM usage during both the compression and decompression stages, as well as decompression speed in MB/s.

    Decompression speed is typically inversely proportional to the compression ratio, so you may think you chose the perfect algorithm to save some bandwidth or disk storage, but whatever consumes that data downstream now has to spend much more time, CPU cycles, and/or RAM to decompress it. RAM usage might seem inconsequential, but maybe the downstream system is an embedded/low-voltage one? Maybe RAM is plentiful, but CPU is limited? All of those things need to be taken into account.
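    As a rough illustration, here is a small sketch of measuring the decompression side (Python, assuming the `lz4` package; the input file name is hypothetical, and the `resource` module is Unix-only):

    ```python
    import time
    import resource  # Unix-only; used to sample peak RSS
    import lz4.frame

    with open("big_file.bin", "rb") as f:  # hypothetical input
        data = f.read()

    compressed = lz4.frame.compress(data)

    t0 = time.perf_counter()
    restored = lz4.frame.decompress(compressed)
    dt = time.perf_counter() - t0

    # ru_maxrss is reported in kilobytes on Linux (bytes on macOS)
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"decompression: {len(restored) / 1e6 / dt:.1f} MB/s")
    print(f"peak RSS:      {peak / 1024:.0f} MB (Linux semantics)")
    assert restored == data  # sanity-check the round trip
    ```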

    Here's an example of a suite of benchmarks done on various algorithms, taking a lot of these considerations into account:

    https://catchchallenger.first-world.info/wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO
