zlib

Using zlib to create a gzip file using zpipe.c example

谁说我不能喝 submitted on 2019-12-06 08:56:01
Question: All, I am just trying to get the zpipe demo working using Dev-C++, where I have all the zlib code imported and am using zpipe.c as an example. Everything compiles and runs. If I try to create a gzip file using the commented-out call to deflateInit2, it creates with no errors, but the file is corrupted when unzipping with 7-Zip. If I use the standard zlib headers to create the file, when I use the corresponding call to inflate it gives me a return of -3/Z_DATA_ERROR, indicating my deflate data is …
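A common cause of this symptom is a wrapper mismatch: deflateInit2 with windowBits = 15 + 16 writes a gzip header and trailer, and the reading side must then use inflateInit2 with 15 + 16 (or 15 + 32 to auto-detect gzip or zlib), not the plain inflateInit from the unmodified zpipe.c, which only accepts the zlib wrapper and reports Z_DATA_ERROR on gzip input. A minimal sketch of the two matching init calls; the function names are illustrative, not part of zpipe.c:

```c
#include <zlib.h>

/* Sketch: start a gzip-format deflate stream (windowBits = 15 + 16).
 * zpipe.c's stock deflateInit(&strm, level) produces a zlib wrapper instead,
 * which 7-Zip will not recognize as a .gz file. */
int start_gzip_deflate(z_stream *strm, int level)
{
    strm->zalloc = Z_NULL;
    strm->zfree  = Z_NULL;
    strm->opaque = Z_NULL;
    /* 15 = 32 KB window, +16 = gzip header/trailer instead of zlib's */
    return deflateInit2(strm, level, Z_DEFLATED, 15 + 16, 8, Z_DEFAULT_STRATEGY);
}

/* Matching inflate side: 15 + 32 auto-detects zlib or gzip wrappers.
 * Plain inflateInit() returns Z_DATA_ERROR (-3) when fed gzip data. */
int start_gzip_inflate(z_stream *strm)
{
    strm->zalloc   = Z_NULL;
    strm->zfree    = Z_NULL;
    strm->opaque   = Z_NULL;
    strm->avail_in = 0;
    strm->next_in  = Z_NULL;
    return inflateInit2(strm, 15 + 32);
}
```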

How to determine compressed size from zlib for gzipped data?

妖精的绣舞 submitted on 2019-12-06 06:35:12
Question: I'm using zlib to perform gzip compression. zlib writes the data directly to an open TCP socket after compressing it.

/* socket_fd is a file descriptor for an open TCP socket */
gzFile gzf = gzdopen(socket_fd, "wb");
int uncompressed_bytes_consumed = gzwrite(gzf, buffer, 1024);

(Of course, all error handling is removed.) The question is: how do you determine how many bytes were written to the socket? All the gz* functions in zlib deal with byte counts/offsets in the uncompressed domain, and …
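The gz* convenience layer only reports uncompressed counts, so one hedged workaround is to bypass gzdopen/gzwrite and drive deflate() yourself, counting the bytes you actually send (the stream's total_out field holds the same number). A sketch under that assumption; gzip_to_fd and the buffer size are illustrative:

```c
#include <string.h>
#include <unistd.h>
#include <zlib.h>

/* Sketch: gzip-compress `len` bytes from `src` and write the result to `fd`,
 * returning the number of COMPRESSED bytes sent, or -1 on error.
 * Unlike gzwrite(), calling deflate() directly exposes the compressed size. */
long gzip_to_fd(int fd, const unsigned char *src, unsigned len)
{
    unsigned char out[16384];
    long sent = 0;
    z_stream strm;
    memset(&strm, 0, sizeof(strm));
    if (deflateInit2(&strm, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                     15 + 16, 8, Z_DEFAULT_STRATEGY) != Z_OK)   /* gzip wrapper */
        return -1;

    strm.next_in  = (Bytef *)src;
    strm.avail_in = len;
    do {
        strm.next_out  = out;
        strm.avail_out = sizeof(out);
        deflate(&strm, Z_FINISH);                 /* one-shot: all input is present */
        long have = (long)(sizeof(out) - strm.avail_out);
        if (have > 0 && write(fd, out, (size_t)have) != have) {
            deflateEnd(&strm);
            return -1;
        }
        sent += have;                             /* compressed bytes on the wire */
    } while (strm.avail_out == 0);

    deflateEnd(&strm);
    return sent;                                  /* equals strm.total_out */
}
```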

How to gzip NSData with zlib?

只谈情不闲聊 submitted on 2019-12-06 06:19:23
I want to use zlib because I'm assuming it's the best and fastest way to gzip NSData. But is there a better way? If not, which version of the zlib library should I link to in Xcode: libz.dylib, libz.1.dylib, libz.1.1.3.dylib, or libz.1.2.5.dylib? Please provide a code example of how to use zlib to convert NSData *normalHTTPBody into NSData *gzippedHTTPBody. Yes, zlib is what is used to gzip data. I know of no better way. As for speed, you can select the compression level to optimize speed vs. compression for your application. You will likely find that libz.dylib and libz.1.dylib are symbolic …
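The gzip step itself is the same C call sequence regardless of the Objective-C wrapper: ask deflate for the gzip framing via deflateInit2 with windowBits = 15 + 16 and feed it the NSData bytes. A rough sketch of those underlying calls, assuming the caller wraps the returned buffer back into an NSData; gzip_buffer and the level-9 choice are illustrative, not an Apple or zlib convenience API:

```c
#include <stdlib.h>
#include <zlib.h>

/* Sketch: gzip-compress an in-memory buffer.
 * An Objective-C wrapper would pass normalHTTPBody.bytes / normalHTTPBody.length
 * and build gzippedHTTPBody from *out / *out_len. Caller frees *out. */
int gzip_buffer(const unsigned char *in, size_t in_len,
                unsigned char **out, size_t *out_len)
{
    uLong bound = compressBound((uLong)in_len) + 18; /* extra room for gzip header/trailer */
    unsigned char *buf = malloc(bound);
    if (!buf) return Z_MEM_ERROR;

    z_stream strm;
    strm.zalloc = Z_NULL;
    strm.zfree  = Z_NULL;
    strm.opaque = Z_NULL;
    /* 15 + 16 selects the gzip wrapper; level 9 trades CPU for smaller output */
    int ret = deflateInit2(&strm, 9, Z_DEFLATED, 15 + 16, 8, Z_DEFAULT_STRATEGY);
    if (ret != Z_OK) { free(buf); return ret; }

    strm.next_in   = (Bytef *)in;
    strm.avail_in  = (uInt)in_len;
    strm.next_out  = buf;
    strm.avail_out = (uInt)bound;
    ret = deflate(&strm, Z_FINISH);               /* single pass: bound is large enough */
    deflateEnd(&strm);
    if (ret != Z_STREAM_END) { free(buf); return Z_BUF_ERROR; }

    *out = buf;
    *out_len = strm.total_out;
    return Z_OK;
}
```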

Compress large file in ruby with Zlib for gzip

寵の児 submitted on 2019-12-06 05:31:23
Question: I have a very large file, approx. 200 million rows of data. I would like to compress it with the Zlib library, specifically using the Writer. Reading through each line one at a time seems like it would take quite a bit of time. Is there a better way to accomplish this? Here is what I have right now:

require 'zlib'

Zlib::GzipWriter.open('compressed_file.gz') do |gz|
  File.open(large_data_file).each do |line|
    gz.write line
  end
  gz.close
end

Answer 1: You can use IO#read to read a chunk of arbitrary …

import zlib ImportError: No module named zlib [duplicate]

社会主义新天地 submitted on 2019-12-06 05:28:56
Question: This question already has answers here: no module named zlib (8 answers). Closed last year.

# pythonbrew venv create django1.5
Creating `django1.5` environment into /usr/local/pythonbrew/venvs/Python-2.7.3
Traceback (most recent call last):
  File "/usr/local/pythonbrew/etc/virtualenv/virtualenv.py", line 19, in <module>
    import zlib
ImportError: No module named zlib

What should I do? I want to import zlib. And I already installed zlib:

# rpm -qa | grep zlib
zlib-1.2.5-7.fc17.i686
zlib-devel-1.2 …

Bandwidth Speed Boost Tips for PHP5 Servers: Output and Zlib Compression

旧街凉风 submitted on 2019-12-06 04:16:08
Question: I have some detailed, specialist questions about the nature of the settings that go in .htaccess when setting up PHP bandwidth savings, and the effective speed gain experienced. Allow me to thank you in advance for your answers and clarifications on this matter, as I don't understand the encyclopedic-style, long-page Apache manuals. The example below is what is actually running on my Apache 2.0 and PHP 5.2.3:

# preserve bandwidth for PHP enabled servers
<ifmodule mod_php4.c>
php_value zlib.output …

golang/python zlib difference

纵饮孤独 submitted on 2019-12-06 02:47:38
Question: Debugging differences between Python's zlib and golang's zlib. Why don't the following have the same results?

compress.go:

package main

import (
    "compress/flate"
    "bytes"
    "fmt"
)

func compress(source string) []byte {
    w, _ := flate.NewWriter(nil, 7)
    buf := new(bytes.Buffer)
    w.Reset(buf)
    w.Write([]byte(source))
    w.Close()
    return buf.Bytes()
}

func main() {
    example := "foo"
    compressed := compress(example)
    fmt.Println(compressed)
}

compress.py:

from __future__ import print_function
import zlib …
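The usual source of the byte-level difference is the framing, not the compressor: Go's compress/flate writes raw DEFLATE data, while Python's zlib.compress wraps the same deflate stream in a 2-byte zlib header and a 4-byte Adler-32 trailer (Go's compress/zlib does the same). Compression levels also matter: the Go code above uses level 7 while zlib.compress defaults to 6, and different levels can emit different bytes even when the framing matches. Expressed in zlib's C API, the framing choice is just the sign of windowBits; a hedged sketch with illustrative function names:

```c
#include <zlib.h>

/* Sketch: the same deflate engine with two different framings. */
static void init_fields(z_stream *strm)
{
    strm->zalloc = Z_NULL;
    strm->zfree  = Z_NULL;
    strm->opaque = Z_NULL;
}

/* windowBits = -15: raw DEFLATE, like Go's compress/flate */
int init_raw_deflate(z_stream *strm, int level)
{
    init_fields(strm);
    return deflateInit2(strm, level, Z_DEFLATED, -15, 8, Z_DEFAULT_STRATEGY);
}

/* windowBits = +15: zlib wrapper (header + Adler-32 trailer),
 * like Python's zlib.compress and Go's compress/zlib */
int init_zlib_deflate(z_stream *strm, int level)
{
    init_fields(strm);
    return deflateInit2(strm, level, Z_DEFLATED, 15, 8, Z_DEFAULT_STRATEGY);
}
```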

Unzip buffer with Python?

佐手、 submitted on 2019-12-06 00:46:11
Question: I have a buffer of bytes read from a library call and I would like to unzip the content, which is a single text file. I tried with zlib, but I get this error:

>>> import zlib
>>> zlib.decompress(buffer)
error: Error -3 while decompressing data: incorrect header check

However, with ZipFile it works, but I have to use a temporary file:

import zipfile
f = open('foo.zip', 'wb')
f.write(buffer)
f.close()
z = zipfile.ZipFile('foo.zip')
z.extractall()
z.close()
with open('foo.txt', 'r') as f:
    uncompressed …

Decompress PNG using zlib

落花浮王杯 submitted on 2019-12-05 21:36:09
How can I use the zlib library to decompress a PNG file? I need to read a PNG file in C under the gcc compiler. I once wrote a basic Java library for reading/writing PNG files: http://code.google.com/p/pngj/ It does not support paletted images, but apart from that [Updated: it supports all PNG variants now] it's fairly complete, simple, and the code has no external dependencies (i.e. it only uses the standard JSE API, which includes zip decompression). And the code is available. I guess you could port it to C without much effort. Why not use libpng? The PNG file format is fairly simple, but …
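Worth keeping in mind: zlib only undoes the compression of the stream stored in the PNG's IDAT chunks; parsing the chunk layout and reversing the per-scanline filters is separate work, which is why libpng is usually the easier route. Assuming the raw IDAT payloads have already been concatenated into one buffer, the zlib step itself is a plain inflate; a sketch with an illustrative function name:

```c
#include <string.h>
#include <zlib.h>

/* Sketch: inflate the concatenated IDAT payload of a PNG.
 * `idat` holds the zlib-wrapped bytes of all IDAT chunks joined together;
 * `raw` receives the filtered scanlines (one filter byte plus pixel bytes per
 * row), which still need PNG filter reconstruction afterwards. */
int inflate_idat(const unsigned char *idat, size_t idat_len,
                 unsigned char *raw, size_t raw_cap, size_t *raw_len)
{
    z_stream strm;
    memset(&strm, 0, sizeof(strm));
    if (inflateInit(&strm) != Z_OK)          /* IDAT data uses the zlib wrapper */
        return Z_STREAM_ERROR;

    strm.next_in   = (Bytef *)idat;
    strm.avail_in  = (uInt)idat_len;
    strm.next_out  = raw;
    strm.avail_out = (uInt)raw_cap;

    int ret = inflate(&strm, Z_FINISH);      /* one shot: raw_cap must hold the whole image */
    *raw_len = strm.total_out;
    inflateEnd(&strm);
    return ret == Z_STREAM_END ? Z_OK : ret;
}
```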

PNG: deflate and zlib

試著忘記壹切 submitted on 2019-12-05 20:02:58
I'm trying to understand compression in PNG, but I seem to find a lot of contradictory information online ... I would like to understand:
- How is searching done in the LZ77 part: a hash table with linked lists? Is this defined in deflate, or implemented in zlib? Is there a choice of the search method?
- Can PNG encoders/decoders set some parameters for the compression (strategy, filter, etc.), or is there a default for PNG?
- Does the LZ77 part do greedy or lazy evaluation? Or is this an option too?
- And finally: the two Huffman trees, are they compressed in a third tree, and all three of them …
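Some of these have clear answers in the specs: RFC 1951 defines only the DEFLATE bitstream, so hash chains, greedy versus lazy matching and the rest of the LZ77 search are implementation choices inside zlib (lazy matching is used at the higher compression levels), and any encoder strategy that emits a valid stream is allowed. In dynamic-Huffman blocks there is indeed a third code: the code lengths of the literal/length and distance trees are run-length encoded and transmitted with a code-length Huffman code whose own lengths are stored directly. On the tuning side, what a PNG encoder can realistically adjust are zlib's level, strategy, memLevel and windowBits (PNG requires a window of at most 32 KB), plus PNG's own per-scanline filters, which are chosen before the data reaches zlib. A sketch of how an encoder might set those knobs; the function name and the Z_FILTERED choice are illustrative defaults, not a PNG requirement:

```c
#include <zlib.h>

/* Sketch: deflate settings a PNG encoder commonly exposes.
 * - level drives how hard zlib's LZ77 search works (lazy matching is part of
 *   zlib's implementation at higher levels, not a requirement of the format).
 * - Z_FILTERED biases the match/Huffman heuristics toward data that has been
 *   run through PNG's scanline filters.
 * - memLevel 8 and windowBits 15 are zlib's defaults. */
int init_png_deflate(z_stream *strm, int level)
{
    strm->zalloc = Z_NULL;
    strm->zfree  = Z_NULL;
    strm->opaque = Z_NULL;
    return deflateInit2(strm, level, Z_DEFLATED,
                        15,            /* zlib wrapper; 32 KB window (PNG's maximum) */
                        8,             /* memLevel */
                        Z_FILTERED);   /* strategy for filtered image data */
}
```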