Fastest real-time decompression algorithm


Question


I'm looking for an algorithm to decompress chunks of data (1k-30k) in real time with minimal overhead. Compression should preferably be fast but isn't as important as decompression speed.

From what I could gather, LZO1X would be the fastest one. Have I missed anything? Ideally, the algorithm should not be under the GPL.


Answer 1:


lz4 is what you're looking for here.

LZ4 is a lossless compression algorithm, providing compression speeds of 400 MB/s per core, scalable across multi-core CPUs. It features an extremely fast decoder, with speeds of multiple GB/s per core, typically reaching RAM speed limits on multi-core systems.
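
For reference, here is a minimal sketch of a round trip through the LZ4 block API (LZ4_compress_default / LZ4_decompress_safe); the payload and file name are illustrative:

    /* A minimal round trip through the LZ4 block API.
       Compile with: cc lz4_demo.c -llz4  (file name is illustrative) */
    #include <stdio.h>
    #include <string.h>
    #include <lz4.h>

    int main(void)
    {
        /* Sample payload; real chunks would be the 1k-30k buffers from the question. */
        const char src[] = "Real-time data that must decompress as fast as possible.";
        const int src_size = (int)sizeof(src);

        /* LZ4_COMPRESSBOUND() gives the worst-case compressed size. */
        char compressed[LZ4_COMPRESSBOUND(sizeof(src))];
        char restored[sizeof(src)];

        const int comp_size =
            LZ4_compress_default(src, compressed, src_size, (int)sizeof(compressed));
        if (comp_size <= 0) { fprintf(stderr, "compression failed\n"); return 1; }

        /* LZ4_decompress_safe() is bounds-checked, so malformed input cannot overrun. */
        const int dec_size =
            LZ4_decompress_safe(compressed, restored, comp_size, src_size);
        if (dec_size < 0) { fprintf(stderr, "decompression failed\n"); return 1; }

        printf("%d -> %d -> %d bytes, round trip %s\n",
               src_size, comp_size, dec_size,
               memcmp(src, restored, (size_t)src_size) == 0 ? "ok" : "MISMATCH");
        return 0;
    }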




Answer 2:


Try Google's Snappy.

Snappy is a compression/decompression library. It does not aim for maximum compression, or compatibility with any other compression library; instead, it aims for very high speeds and reasonable compression. For instance, compared to the fastest mode of zlib, Snappy is an order of magnitude faster for most inputs, but the resulting compressed files are anywhere from 20% to 100% bigger. On a single core of a Core i7 processor in 64-bit mode, Snappy compresses at about 250 MB/sec or more and decompresses at about 500 MB/sec or more.
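
A minimal sketch using the C bindings (snappy-c.h) that ship with Snappy; the payload and fixed buffer sizes are illustrative:

    /* Round trip through Snappy's C bindings.
       Compile with: cc snappy_demo.c -lsnappy  (file name is illustrative) */
    #include <stdio.h>
    #include <string.h>
    #include <snappy-c.h>

    int main(void)
    {
        const char src[] = "Chunks of 1k-30k decompressed with minimal overhead.";
        size_t src_len = sizeof(src);

        /* snappy_max_compressed_length() gives the worst-case output size;
           256 bytes is enough for this illustrative payload. */
        char compressed[256];
        size_t comp_len = sizeof(compressed);
        if (snappy_max_compressed_length(src_len) > comp_len) return 1;

        if (snappy_compress(src, src_len, compressed, &comp_len) != SNAPPY_OK) {
            fprintf(stderr, "compression failed\n");
            return 1;
        }

        char restored[sizeof(src)];
        size_t dec_len = sizeof(restored);
        if (snappy_uncompress(compressed, comp_len, restored, &dec_len) != SNAPPY_OK) {
            fprintf(stderr, "decompression failed\n");
            return 1;
        }

        printf("%zu -> %zu -> %zu bytes, round trip %s\n",
               src_len, comp_len, dec_len,
               memcmp(src, restored, src_len) == 0 ? "ok" : "MISMATCH");
        return 0;
    }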




Answer 3:


When you cannot use GPL-licensed code, your choice is clear: zlib. Very permissive license, fast compression, fair compression ratio, very fast decompression; it works everywhere and has been ported to every sane language.
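
A minimal sketch of zlib's one-shot compress()/uncompress() helpers; for streaming chunked input you would use the inflate API instead, but this shows the shape. Payload and buffer sizes are illustrative:

    /* Round trip through zlib's one-shot helpers.
       Compile with: cc zlib_demo.c -lz  (file name is illustrative) */
    #include <stdio.h>
    #include <string.h>
    #include <zlib.h>

    int main(void)
    {
        const Bytef src[] = "zlib: permissive license, ported everywhere.";
        uLong src_len = (uLong)sizeof(src);

        /* compressBound() gives the worst-case compressed size;
           256 bytes is enough for this illustrative payload. */
        Bytef compressed[256];
        uLongf comp_len = sizeof(compressed);
        if (compressBound(src_len) > comp_len) return 1;

        if (compress(compressed, &comp_len, src, src_len) != Z_OK) {
            fprintf(stderr, "compression failed\n");
            return 1;
        }

        Bytef restored[sizeof(src)];
        uLongf dec_len = sizeof(restored);
        if (uncompress(restored, &dec_len, compressed, comp_len) != Z_OK) {
            fprintf(stderr, "decompression failed\n");
            return 1;
        }

        printf("%lu -> %lu -> %lu bytes, round trip %s\n",
               src_len, comp_len, dec_len,
               memcmp(src, restored, src_len) == 0 ? "ok" : "MISMATCH");
        return 0;
    }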



Source: https://stackoverflow.com/questions/2479098/fastest-real-time-decompression-algorithm
