compression

How do I zip a directory of files using C++?

Submitted by 柔情痞子 on 2019-11-28 20:53:57
I'm working on a project using C++, Boost, and Qt. I understand how to compress single files and bytestreams using, for example, the qCompress() function in Qt. How do I zip a directory of multiple files, including subdirectories? I am looking for a cross-platform (Mac, Win, Linux) solution; I'd prefer not to fire off a bunch of new processes. Is there a standard way to combine bytestreams from multiple files into a zipped archive, or is there perhaps a convenience function or method available in the Boost iostream library? Many thanks for the assistance. Update: The QuaZip
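
The question asks for C++/Qt (and the update points at QuaZip), but the underlying recipe is the same on every platform: walk the directory tree and write each file into the archive under its path relative to the root, so the subdirectory structure is preserved. A minimal cross-platform sketch of that recipe, using Python's standard zipfile module purely for illustration:

```python
import os
import zipfile

def zip_directory(src_dir, archive_path):
    """Walk src_dir recursively and write every file into one ZIP archive,
    storing each entry under its path relative to src_dir so the directory
    tree (including subdirectories) is preserved."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, arcname=os.path.relpath(full, src_dir))
```

With QuaZip the structure is roughly analogous: iterate the tree (e.g. with Qt's QDirIterator) and write each file's bytes through the archive API instead of compressing each stream separately with qCompress().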

How to efficiently predict if data is compressible

Submitted by 落花浮王杯 on 2019-11-28 20:26:48
I want to write a storage backend to store larger chunks of data. The data can be anything, but it is mainly binary files (images, PDFs, JAR files) or text files (XML, JSP, JS, HTML, Java...). I found that most of the data is already compressed; if everything is compressed, about 15% of disk space can be saved. I am looking for the most efficient algorithm that can predict, with high probability, whether a chunk of data (say, 128 KB) is compressible (losslessly), without having to look at all the data if possible. The compression algorithm will be either LZF, Deflate, or something
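
One cheap heuristic that fits the question's constraints: deflate only a small sample of the chunk at the fastest compression level and look at the ratio. Already-compressed or random data will not shrink, so you can skip compressing the rest. A sketch, with the sample size and threshold as tunable assumptions, not recommended values:

```python
import zlib

def looks_compressible(chunk, sample_size=1024, threshold=0.9):
    """Heuristic: deflate a small prefix of the chunk at the fastest zlib
    level; if the sample does not shrink below `threshold` of its original
    size, assume the whole chunk is already compressed (or random) and
    store it as-is. Only sample_size bytes are ever examined."""
    sample = chunk[:sample_size]
    if not sample:
        return False
    compressed = zlib.compress(sample, 1)  # level 1 = fastest
    return len(compressed) < threshold * len(sample)
```

A sample from the middle of the chunk (rather than the prefix) may be more representative for container formats that start with an uncompressed header.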

Compressing content with PHP ob_start() vs Apache Deflate/Gzip?

Submitted by 耗尽温柔 on 2019-11-28 20:02:12
Question: Most sites want to compress their content to save on bandwidth. However, when it comes to Apache servers running PHP there are two ways to do it: with PHP or with Apache. So which one is faster or easier on your server? For example, in PHP I run the following function at the start of my pages to enable it: /** * Gzip compress page output * Original function came from wordpress.org */ function gzip_compression() { //If no encoding was given - then it must not be able to accept gzip pages if(
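
Whichever layer does the work (PHP's output buffer or Apache's mod_deflate), the logic is the same: compress only when the client's Accept-Encoding header advertises gzip support, and label the response with the encoding actually used. A language-neutral sketch of that check (Python here, purely for illustration):

```python
import gzip

def maybe_gzip(body, accept_encoding):
    """Core of what ob_gzhandler / mod_deflate do: compress the response
    body only when the client advertises gzip support, and return the
    Content-Encoding value the response should carry."""
    if "gzip" in accept_encoding.lower():
        return gzip.compress(body), "gzip"
    return body, "identity"
```

The performance question then reduces to where the CPU cycles are spent: mod_deflate runs in the web server's C code for every response type, while the PHP handler only covers PHP output and competes with the interpreter for time.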

Unzip a zip file using zlib

Submitted by 落花浮王杯 on 2019-11-28 19:55:16
I have an archive.zip which contains two encrypted ".txt" files. I would like to decompress the archive in order to retrieve those 2 files. Here's what I've done so far: FILE *FileIn = fopen("./archive.zip", "rb"); if (FileIn) printf("file opened\n"); else printf("unable to open file\n"); fseek(FileIn, 0, SEEK_END); unsigned long FileInSize = ftell(FileIn); printf("size of input compressed file : %lu\n", FileInSize); void *CompDataBuff = malloc(FileInSize); void *UnCompDataBuff = NULL; int fd = open ("archive.zip", O_RDONLY); CompDataBuff = mmap(NULL, FileInSize, PROT_READ | PROT_WRITE, MAP
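
A caveat worth making explicit: a .zip file is a container, not a single zlib stream, so inflating the whole file at once will fail. Each entry is (typically) a raw DEFLATE stream located via the archive's local file headers, and raw DEFLATE must be inflated with no zlib wrapper — in zlib terms, negative windowBits. A small Python sketch of that last step:

```python
import zlib

def inflate_zip_entry(raw_deflate_bytes):
    """Inflate the compressed data of a single ZIP entry. ZIP entries are
    raw DEFLATE streams (no zlib header or checksum), so they must be
    inflated with negative wbits to suppress the wrapper."""
    return zlib.decompressobj(-zlib.MAX_WBITS).decompress(raw_deflate_bytes)
```

In C, the equivalent is inflateInit2(&strm, -MAX_WBITS) after parsing the entry headers yourself, or more practically the minizip code that ships in zlib's contrib directory, which handles the container format for you.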

LZW compression/decompression under low memory conditions

Submitted by ⅰ亾dé卋堺 on 2019-11-28 19:52:53
Can anybody give pointers on how I can implement LZW compression/decompression in low-memory conditions (< 2 KB)? Is that possible? The zlib library that everyone uses is bloated, among other problems (for embedded). I am pretty sure it won't work for your case. I had a little more memory, maybe 16 KB, and couldn't get it to fit. It allocates and zeros large chunks of memory and keeps copies of stuff, etc. The algorithm can maybe do it, but finding existing code is the challenge. I went with http://lzfx.googlecode.com The decompression loop is tiny; it is the older LZ-type compression that relies on the
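
The reason lzfx-style decoders fit where zlib does not: an LZ77-family decompressor needs no dictionary tables at all — it only copies literals and back-references out of the output it has already produced, so its working memory is essentially just the output buffer. A toy decoder illustrating that loop (the token format here is invented for the example, not lzfx's actual format):

```python
def lz_decompress(data, out_size):
    """Minimal LZ77-style decoder in the spirit of lzfx: tokens are either
    a literal run or a (length, distance) back-reference copied from the
    already-produced output. Hypothetical token format for illustration:
    0x00 <count> <bytes...> for literals, 0x01 <length> <distance> for a
    back-reference. No tables are allocated; only the output grows."""
    out = bytearray()
    i = 0
    while i < len(data) and len(out) < out_size:
        tag = data[i]
        if tag == 0:  # literal run: next byte is the count, then the bytes
            n = data[i + 1]
            out += data[i + 2:i + 2 + n]
            i += 2 + n
        else:  # back-reference into the output produced so far
            n, dist = data[i + 1], data[i + 2]
            for _ in range(n):  # byte-by-byte copy allows overlapping matches
                out.append(out[-dist])
            i += 3
    return bytes(out)
```

Note the byte-by-byte copy: it lets a match overlap its own output (distance smaller than length), which is how runs like "abcabcabc" are encoded from a 3-byte seed.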

Compression algorithm for JSON encoded packets?

Submitted by 倖福魔咒の on 2019-11-28 19:36:23
What would be the best compression algorithm to use to compress packets before sending them over the wire? The packets are encoded using JSON. Would LZW be a good one for this or is there something better? I think two questions will affect your answer: 1) How well can you predict the composition of the data without knowing what will happen on any particular run of the program? For instance, if your packets look like this: { "vector": { "latitude": 16, "longitude": 18, "altitude": 20 }, "vector": { "latitude": -8, "longitude": 13, "altitude": -5 }, [... et cetera ...] } -- then you would
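
For small JSON packets, plain Deflate often does poorly because each packet is too short for the compressor to build up a useful dictionary before it ends. zlib's preset-dictionary feature addresses exactly this case: seed both sender and receiver with the keys you expect in every packet. A sketch (the key set in SHARED_DICT is hypothetical and must be identical on both ends):

```python
import json
import zlib

# Preset dictionary seeded with the field names expected in every packet.
# Both sender and receiver must use the exact same bytes (example keys).
SHARED_DICT = b'{"vector": {"latitude": "longitude": "altitude": }'

def compress_packet(obj):
    """Serialize to JSON and deflate with the shared preset dictionary,
    so repeated key names compress well even in tiny packets."""
    co = zlib.compressobj(level=9, zdict=SHARED_DICT)
    return co.compress(json.dumps(obj).encode()) + co.flush()

def decompress_packet(blob):
    """Inflate with the same preset dictionary and parse the JSON back."""
    do = zlib.decompressobj(zdict=SHARED_DICT)
    return json.loads(do.decompress(blob) + do.flush())
```

If the packet structure is as predictable as the example in the question, an even bigger win is to stop sending the keys at all (a fixed binary layout or a schema-based format), since the receiver already knows the field names.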

How to compress mp4 video using MediaCodec Android?

Submitted by こ雲淡風輕ζ on 2019-11-28 19:34:43
In my Android app, I want to compress an mp4 video by changing its resolution and bitrate. I don't want to use FFmpeg (because I don't want to use the NDK), so I decided to use the MediaCodec API. Here are my logical steps: Extract the video file with MediaExtractor, then decode the data. Create a new encoder with my new resolution and bitrate, and encode the data. Use MediaMuxer to create a new mp4 file. My problem is: I don't know how to set up the connection between the output of the decoder and the input of the encoder. I can decode the video to a Surface or encode a new video from a Surface. But I don't understand how to connect them.

GZipStream And DeflateStream will not decompress all bytes

Submitted by て烟熏妆下的殇ゞ on 2019-11-28 18:49:11
Question: I needed a way to compress images in .NET, so I looked into using the .NET GZipStream class (or DeflateStream). However, I found that decompression was not always successful: sometimes the images would decompress fine, and other times I would get a GDI+ error that something was corrupted. After investigating the issue, I found that decompression was not giving back all the bytes it had compressed. So if I compressed 2257974 bytes I would sometimes get back only 2257870 bytes (real numbers)
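
The two usual causes of this symptom, in any language: the compressed bytes are consumed before the compressor has been closed/flushed (so the final block and trailer are missing), or the reader assumes a single Read() call fills its buffer instead of looping until end of stream. Both fixes, sketched in Python for illustration (in .NET the analogues are closing the GZipStream before reading the MemoryStream, and looping on Stream.Read):

```python
import gzip
import io

def gzip_bytes(data):
    """Compress fully before handing out the bytes: closing the writer
    (here via the with-block) flushes any buffered blocks and writes the
    trailer. Reading the buffer earlier yields a truncated stream."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        gz.write(data)
    return buf.getvalue()

def gunzip_all(blob, chunk=4096):
    """Decompress in a loop: a single read may return fewer bytes than
    requested, so keep reading until the stream reports end-of-data."""
    out = bytearray()
    with gzip.GzipFile(fileobj=io.BytesIO(blob)) as gz:
        while True:
            piece = gz.read(chunk)
            if not piece:
                break
            out += piece
    return bytes(out)
```

A byte count mismatch like 2257974 vs 2257870 (a shortfall of about a hundred bytes) is characteristic of the missing-final-flush case rather than random corruption.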

How to use System.IO.Compression to read/write ZIP files?

Submitted by 狂风中的少年 on 2019-11-28 18:21:47
I know there are libraries out there for working with ZIP files. And, you can alternatively use the functionality built into Windows for working with ZIP files. But I'm wondering if anyone has worked out how to use the tools built into the System.IO.Compression namespace within .NET for reading/writing ZIP files? Or is it not possible using only this namespace? UPDATED: I've seen someone comment that the System.IO.Packaging namespace might be useful for this as well. Does anyone know exactly how to do it? MSDN has a complete example http://msdn.microsoft.com/en-us/library/system.io.packaging

What is the best compression algorithm that allows random reads/writes in a file?

Submitted by 試著忘記壹切 on 2019-11-28 18:20:00
What is the best compression algorithm that allows random reads/writes in a file? I know that any adaptive compression algorithm would be out of the question, and I know Huffman coding would be out of the question. Does anyone have a better compression algorithm that would allow random reads/writes? I think you could use any compression algorithm if you write it in blocks, but ideally I would not like to have to decompress a whole block at a time. But if you have suggestions on an easy way to do this, and how to know the block boundaries, please let me know. If this is part of your solution,
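
The block-based approach the asker sketches is the standard answer (formats like BGZF, used by bgzip, work this way): compress fixed-size blocks independently and keep an index of each compressed block's offset, so a random read inflates only the block(s) covering the requested range. A minimal sketch, with the block size as a tunable assumption:

```python
import zlib

BLOCK = 4096  # uncompressed bytes per block (tunable trade-off:
              # bigger blocks compress better, smaller blocks read faster)

def compress_blocked(data):
    """Compress independent fixed-size blocks and record the offset of
    each compressed block, so any byte can be reached by inflating only
    the block that contains it."""
    blocks, index, off = [], [], 0
    for i in range(0, len(data), BLOCK):
        c = zlib.compress(data[i:i + BLOCK], 6)
        index.append(off)
        off += len(c)
        blocks.append(c)
    return b"".join(blocks), index

def read_at(blob, index, pos, n):
    """Random read: locate each block covering [pos, pos+n), inflate it,
    and slice out the requested bytes. Reads spanning a block boundary
    simply inflate the next block too."""
    out = b""
    while n > 0:
        b = pos // BLOCK
        start = index[b]
        end = index[b + 1] if b + 1 < len(index) else len(blob)
        block = zlib.decompress(blob[start:end])
        piece = block[pos % BLOCK:pos % BLOCK + n]
        out += piece
        pos += len(piece)
        n -= len(piece)
    return out
```

Random writes follow the same index: re-compress only the affected block, though a changed compressed length means either padding blocks to a fixed compressed size or rewriting the index offsets after the edit.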