compression

Get compression ratio of image

◇◆丶佛笑我妖孽 submitted on 2019-12-11 03:05:40
Question: I use LibJPEG to read JPEG-compressed images. Is there a way to get the current compression ratio of the unchanged image?

Answer 1: Do you mean the encoding quality, usually a number from 0 to 100? That isn't stored - it's used as a guide for the encoder as to how accurate the waves should be, and it's then discarded. There's no field for it in any of the JFIF header structures. As far as I can see there's no formal definition of what this number means across encoders, so you can't precisely…
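
While the quality setting is discarded, an effective compression ratio can still be computed from sizes: uncompressed pixel data (width × height × components, which LibJPEG reports after jpeg_read_header) versus bytes on disk. A minimal sketch, in Python for brevity; the function and parameter names are illustrative:

```python
import os

def compression_ratio(jpeg_path, width, height, components=3):
    """Effective ratio: size of the raw decoded pixel data
    divided by the size of the compressed file on disk."""
    raw_size = width * height * components  # uncompressed bytes
    file_size = os.path.getsize(jpeg_path)
    return raw_size / file_size
```

A ratio of, say, 20:1 is typical for photographic JPEGs; this is about the only "compression ratio" the file itself can tell you.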

Read a simple/bz2-compressed file (line by line) by detecting whether it is compressed or not (file size is large)

巧了我就是萌 submitted on 2019-12-11 02:59:12
Question: I wrote code to read a simple-text or bz2-compressed file. I used the magic characters of the bz2 format to detect whether the file is compressed or not. NOTE: the user may or may not provide the file with the proper extension. My code:

#include <iostream>
#include <sstream>
#include <vector>
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/copy.hpp>
#include <boost/iostreams/filter/bzip2.hpp>
// compile using
// g++ -std=c++11 code.cpp -lboost_iostreams
// run using
// ./a.out < compressed_file
// ./a…
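
The detect-then-read idea (a bz2 stream always starts with the magic bytes "BZh") can be sketched compactly in Python for comparison; `open_maybe_bz2` is an illustrative name:

```python
import bz2

def open_maybe_bz2(path):
    """Open a file for line-by-line reading, transparently
    decompressing if it starts with the bz2 magic 'BZh'."""
    with open(path, "rb") as f:
        magic = f.read(3)
    if magic == b"BZh":
        return bz2.open(path, "rt")  # streaming decompression
    return open(path, "r")
```

Iterating over the returned handle (`for line in open_maybe_bz2(path): ...`) is lazy in both branches, so a large file is never loaded whole into memory.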

Compress a file with GZipStream while maintaining its meta-data

五迷三道 submitted on 2019-12-11 02:52:40
Question: How can I get the extension of a compressed file after it has been compressed with System.IO.Compression.GZipStream? For example, if the original file is named test.doc and compresses to test.gz, how do I know what file extension to use when decompressing?

Answer 1: I had to do this some time ago. The solution is to use the J# libraries; you still write it in C#, however. http://msdn.microsoft.com/en-us/magazine/cc164129.aspx That's Microsoft's answer on the topic.

Answer 2: There is no way to get the…
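
For context: the gzip container (RFC 1952) does define an optional FNAME header field for the original filename, but GZipStream does not write it, which is why the usual convention is to name the output test.doc.gz. A sketch of what the format allows, shown in Python (whose gzip module does emit FNAME); the helper names are illustrative:

```python
import gzip
import os
import struct

def gzip_with_name(src_path, dst_path):
    """Compress src_path, storing its basename in the gzip FNAME
    header field (RFC 1952) so it can be recovered later."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as out:
        with gzip.GzipFile(filename=src_path, fileobj=out, mode="wb") as gz:
            gz.write(src.read())

def stored_name(gz_path):
    """Read the optional FNAME field back out of the gzip header,
    or return None if the writer did not set it."""
    with open(gz_path, "rb") as f:
        header = f.read(10)          # fixed 10-byte member header
        flags = header[3]
        if not flags & 0x08:         # FNAME flag not set
            return None
        if flags & 0x04:             # skip FEXTRA field if present
            xlen = struct.unpack("<H", f.read(2))[0]
            f.read(xlen)
        name = b""
        while (c := f.read(1)) != b"\x00":  # NUL-terminated name
            name += c
        return name.decode("latin-1")
```

Run against a file produced by GZipStream, `stored_name` would return None, confirming that the filename-in-extension convention is the only metadata you get.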

Lossless data compression in C without dynamic memory allocation

浪子不回头ぞ submitted on 2019-12-11 02:27:36
Question: I'm currently trying to implement a lossless data compression algorithm for a project I'm working on. The goal is to compress a fixed-size list of floating-point values. The code has to be written in C and can NOT use dynamic memory allocation. This hurts me greatly, since most, if not all, lossless algorithms require some dynamic allocation. The two main algorithms I've been looking into are Huffman coding and arithmetic coding. Would this task be possible without dynamic memory allocation? Are there…

iOS (objective c) compression_decode_buffer() returns zero

你说的曾经没有我的故事 submitted on 2019-12-11 02:19:35
Question: I'm converting a very large JSON result on my server to a compressed format that I can decompress in my Objective-C app. I would prefer to use the iOS 9 compression library if possible (libcompression.tbd), described in Apple's CompressionSample/BlockCompression.c sample code. I'm passing the compressed NSData result to the following method:

#include "compression.h"
...
- (NSData *) getDecompressedData:(NSData *) compressed {
    size_t dst_buffer_size = 20000000; // 20 MB
    uint8_t *dst_buffer = malloc…
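
One common cause of a zero return here (an assumption about this case, since the question is truncated) is a framing mismatch: libcompression's COMPRESSION_ZLIB codec expects a raw DEFLATE stream (RFC 1951), without the two-byte zlib header that typical server-side compression emits. A server-side sketch in Python producing the raw framing:

```python
import zlib

def raw_deflate(data: bytes) -> bytes:
    """Compress to raw DEFLATE (no zlib header or checksum), the
    framing libcompression's COMPRESSION_ZLIB codec expects."""
    co = zlib.compressobj(9, zlib.DEFLATED, -15)  # wbits=-15 -> raw
    return co.compress(data) + co.flush()

def raw_inflate(blob: bytes) -> bytes:
    """Round-trip check: inflate a raw DEFLATE stream."""
    return zlib.decompress(blob, -15)
```

If the server must keep emitting standard zlib data, the alternative is to strip the 2-byte header (and trailing Adler-32 checksum) client-side before calling compression_decode_buffer.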

Streaming decompression of zip archives in python

有些话、适合烂在心里 submitted on 2019-12-11 02:17:10
Question: Is there a way to do streaming decompression of single-file zip archives? I currently have arbitrarily large zipped archives (single file per archive) in S3. I would like to be able to process the files by iterating over them without having to actually download the files to disk or into memory. A simple example:

import boto

def count_newlines(bucket_name, key_name):
    conn = boto.connect_s3()
    b = conn.get_bucket(bucket_name)
    # key is a .zip file
    key = b.get_key(key_name)
    count = 0
    for chunk in…
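
One workable approach, sketched under stated assumptions (single DEFLATE-compressed member, no encryption, sizes recorded in the local header rather than a trailing data descriptor): skip the zip local file header, then feed chunks through a raw-DEFLATE decompressor, so only one chunk is in memory at a time. The function name is illustrative; the file-like object could be an S3 key, since boto keys support chunked reads:

```python
import struct
import zlib

def iter_zip_member(fileobj, chunk_size=64 * 1024):
    """Stream-decompress the first member of a single-file zip
    archive from a file-like object, yielding uncompressed chunks.
    Assumes DEFLATE (method 8), no encryption, sizes known up front."""
    hdr = fileobj.read(30)                  # fixed local file header
    sig, _ver, _flags, method, _t, _d, _crc, csize, _usize, nlen, elen = \
        struct.unpack("<4sHHHHHIIIHH", hdr)
    assert sig == b"PK\x03\x04" and method == 8
    fileobj.read(nlen + elen)               # skip filename + extra field
    inflater = zlib.decompressobj(-15)      # raw DEFLATE inside zip
    remaining = csize
    while remaining > 0:
        chunk = fileobj.read(min(chunk_size, remaining))
        if not chunk:
            break
        remaining -= len(chunk)
        yield inflater.decompress(chunk)
    yield inflater.flush()
```

Counting newlines then becomes `sum(chunk.count(b"\n") for chunk in iter_zip_member(key))`, with no temp file and no full in-memory buffer.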

Compression in node.js

a 夏天 submitted on 2019-12-11 02:13:22
Question: I'm putting some bigger JSON values in my caching layer (Redis), and I think they could use some compression to cut down my memory usage a bit. Which compression modules for node.js do you use? For some reason, everything listed on the joyent/node Modules wiki looks fishy - either 404s, no commits for more than a year, very few people watching, or open reports of memory leaks. Snappy looks nice, but I'd rather go for something more portable. I'd naturally prefer an async compression…

What are the benefits of the different PHP compression libraries?

不羁岁月 submitted on 2019-12-11 01:36:00
Question: I've been looking into ways to compress PHP libraries, and I've found several libraries which might be useful, but I really don't know much about them. I've specifically been reading about the bcompiler and PHAR libraries. Is there any performance benefit to either of these? Are there any "gotchas" I need to watch out for? What are the relative benefits? Does either of them add to or detract from performance? I'm also interested in learning of other libs which might be out there which are not obvious…

Possible to use CodeIgniter output compression with <pre> to display code blocks?

爱⌒轻易说出口 submitted on 2019-12-11 00:59:47
Question: Is it possible to exclude <pre> tags from this CodeIgniter compression hook? I don't understand regular expressions well enough not to break my page. I have tried, but it always jacks up the output. EDIT: This CodeIgniter compression hook strips all unnecessary white space and formatting from the output in order to compress it - including inside <pre> tags, which rely on that spacing and formatting to display code correctly. I'm trying to show code examples in a compressed output page. <?php if (…
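
The usual trick is to split the output on <pre>…</pre> blocks with a capturing regex and collapse whitespace only in the segments between them. A sketch of the idea, shown in Python since the question's PHP hook is truncated (the PHP version would use preg_split with PREG_SPLIT_DELIM_CAPTURE):

```python
import re

PRE_BLOCK = re.compile(r"(<pre\b.*?</pre>)", re.IGNORECASE | re.DOTALL)

def minify_html(html):
    """Collapse runs of whitespace everywhere except inside <pre>
    blocks, whose spacing is significant."""
    parts = PRE_BLOCK.split(html)
    # A capturing group in split() keeps the <pre> blocks at the
    # odd indices, untouched; even indices are safe to collapse.
    return "".join(
        part if i % 2 else re.sub(r"\s+", " ", part)
        for i, part in enumerate(parts)
    )
```

Note the non-greedy `.*?` with DOTALL, so each block ends at its own closing tag; a greedy match would swallow everything between the first <pre> and the last </pre>.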

Multiple parts created while inserting in Hive table

萝らか妹 submitted on 2019-12-10 22:49:52
Question: I have a Hive table (with compression) with a definition like:

create table temp1 (col1 string, col2 int)
partitioned by (col3 string, col4 string)
row format delimited
  fields terminated by ','
  escaped by '\\'
  lines terminated by '\n'
stored as sequencefile;

When I do a simple select-and-insert (no reducers running) from another Hive table into this table, I see a unique pattern: the data in this compressed table gets split into a high number of very small files (table 1: at times 1 GB of data gets…