compression

Inno Setup - How to add multiple arc files to decompress?

落花浮王杯 Submitted on 2019-11-29 08:57:45
I am using the code from Inno Setup - How to add cancel button to decompressing page? (Martin Prikryl's answer) to decompress an arc file with Inno Setup. I want the option of decompressing more than one arc file, to install files based on the components selection (for example), but still show one overall progress bar for all the extractions as a whole. Is this possible? This is a modification of my answer to Inno Setup - How to add cancel button to decompressing page? The prerequisites are the same; refer to the other answer. In ExtractArc, call AddArchive for each archive you want to extract. [Files]

Only decompress a specific bzip2 block

冷暖自知 Submitted on 2019-11-29 08:38:03
Question: Say I have a bzip2 file (over 5 GB) and I want to decompress only block #x, because that is where my data is (the block is different every time). How would I do this? I thought about making an index of where all the blocks are, then cutting the block I need out of the file and applying bzip2recover to it. I also thought about compressing, say, 1 MB at a time, then appending each piece to a file (and recording its location), and simply grabbing the piece when I need it, but I'd rather keep the original bzip2 file
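The second idea the asker mentions (compressing fixed-size chunks as independent streams and recording their offsets) can be sketched with Python's standard bz2 module. The chunk size and index layout below are illustrative assumptions, not part of the original question:

```python
import bz2

def build_chunked_bz2(data, chunk_size):
    """Compress data as independent bz2 streams; return blob plus an offset index."""
    blob = bytearray()
    index = []  # one (compressed_offset, compressed_length) entry per chunk
    for i in range(0, len(data), chunk_size):
        comp = bz2.compress(data[i:i + chunk_size])
        index.append((len(blob), len(comp)))
        blob += comp
    return bytes(blob), index

def read_chunk(blob, index, k):
    """Decompress only chunk k, without touching the rest of the blob."""
    off, length = index[k]
    return bz2.decompress(blob[off:off + length])
```

Because each chunk is a complete bz2 stream, random access costs one small decompression; the price is slightly worse overall compression than one big stream.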

Lossless hierarchical run length encoding

≯℡__Kan透↙ Submitted on 2019-11-29 07:58:18
Question: I want to summarize rather than compress, in a manner similar to run-length encoding but in a nested sense. For instance, I want ABCBCABCBCDEEF to become (2A(2BC))D(2E)F. I am not concerned that an option is picked between two identical-length possible nestings, e.g. ABBABBABBABA could be (3ABB)ABA or A(3BBA)BA, which are of the same compressed length despite having different structures. However, I do want the choice to be MOST greedy. For instance, ABCDABCDCDCDCD would pick (2ABCD)(3CD) - of length
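A minimal greedy sketch of such nested run-length summarization in Python; the tie-breaking rule (pick the repeat that absorbs the most characters, then recurse into the repeated unit) is my assumption based on the examples in the question, and edge cases may differ from the asker's intent:

```python
def nested_rle(s):
    """Greedy nested run-length summary, e.g. ABCBCABCBCDEEF -> (2A(2BC))D(2E)F."""
    out = []
    i, n = 0, len(s)
    while i < n:
        best = None  # (chars_absorbed_beyond_first_copy, unit_length, repeat_count)
        for unit_len in range(1, (n - i) // 2 + 1):
            unit = s[i:i + unit_len]
            count = 1
            while s[i + count * unit_len : i + (count + 1) * unit_len] == unit:
                count += 1
            if count > 1:
                saved = (count - 1) * unit_len
                if best is None or saved > best[0]:
                    best = (saved, unit_len, count)
        if best:
            _, unit_len, count = best
            # Recurse so repeated units are themselves summarized.
            out.append(f"({count}{nested_rle(s[i:i + unit_len])})")
            i += unit_len * count
        else:
            out.append(s[i])
            i += 1
    return "".join(out)
```

This is O(n^2) per level; it reproduces both worked examples from the question but is a sketch, not an optimal summarizer.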

Compress file before upload via http

北城余情 Submitted on 2019-11-29 07:15:24
Is it possible to compress data being sent from the client's browser (a file upload) to the server? Flash, Silverlight and other technologies are OK! Browsers never compress uploaded data because they have no way of knowing whether the server supports it. Downloaded content can be compressed because the Accept-Encoding request header lets the browser indicate to the server that it supports compressed content. Unfortunately, there is no equivalent protocol that works the other way and allows the server to indicate to the browser that it supports compression. If you have control over the server
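When you do control both ends (a custom client rather than a plain browser form), the upload body can be gzipped and labeled with Content-Encoding: gzip; whether the server honors that header is an assumption you must verify, and the URL below is hypothetical. A minimal Python sketch of building such a request:

```python
import gzip
import urllib.request

def gzipped_upload_request(url, raw):
    """Build a POST request whose body is gzip-compressed client-side."""
    body = gzip.compress(raw)
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Encoding": "gzip",          # server must be configured to expect this
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )
```

For compressible payloads the bandwidth saving can be large, but a server that does not decode Content-Encoding on requests will simply store the gzipped bytes.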

Can ClickOnce deployed application setup be compressed?

一个人想着一个人 Submitted on 2019-11-29 07:12:58
I publish a Windows Forms application using ClickOnce. The installation is quite big considering the overall size of this app: something over 15 MB. If I compress the locally built application, it is squeezed to 2.5 MB. Can ClickOnce deployment be compressed somehow? If not, is anyone using IIS compression to speed up transfers? Would that help? As far as I know, you can't really manually compress your assemblies. However, you absolutely can use IIS compression. From my testing with a bandwidth monitor, it makes a significant difference. And once it's set up, you never have to think about it, it

Is this a bug in this gzip inflate method?

时光毁灭记忆、已成空白 Submitted on 2019-11-29 07:04:00
When searching for how to inflate gzip-compressed data on iOS, the following method appears in a number of results: - (NSData *)gzipInflate { if ([self length] == 0) return self; unsigned full_length = [self length]; unsigned half_length = [self length] / 2; NSMutableData *decompressed = [NSMutableData dataWithLength: full_length + half_length]; BOOL done = NO; int status; z_stream strm; strm.next_in = (Bytef *)[self bytes]; strm.avail_in = [self length]; strm.total_out = 0; strm.zalloc = Z_NULL; strm.zfree = Z_NULL; if (inflateInit2(&strm, (15+32)) != Z_OK) return nil; while (!done) { // Make
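The inflateInit2(&strm, (15+32)) call in the quoted code uses zlib's windowBits convention: 15 is the maximum window size, and adding 32 enables automatic detection of zlib versus gzip headers. Python's zlib binding exposes the same convention, which makes the behavior easy to sanity-check:

```python
import gzip
import zlib

payload = b"hello gzip" * 100
compressed = gzip.compress(payload)

# wbits = 15 + 32: max window, auto-detect zlib or gzip header (same as inflateInit2)
d = zlib.decompressobj(15 + 32)
restored = d.decompress(compressed) + d.flush()
assert restored == payload
```

The usual bugs in the Objective-C version floating around are elsewhere: leaving strm.opaque uninitialized and mishandling Z_BUF_ERROR / output-buffer growth, not the windowBits value itself.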

Minify HTML/PHP

笑着哭i Submitted on 2019-11-29 07:02:35
I'm using gzip to compress my HTML/PHP files along with JS/CSS/etc. This reduces the payload quite nicely, but I also want to 'minify' my markup on both .html and .php pages. Ideally I'd like to control this from an .htaccess file (where I also do the gzipping) rather than having to include PHP in each file. I'd like the output to be like that of http://google.com or http://www.w3-edge.com/wordpress-plugins/w3-total-cache/ and http://css-tricks.com (both produced by the W3 Total Cache plugin for WordPress). Can anyone recommend a good way to do this? Looking at the examples, minifying the HTML
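For illustration only (this is not the .htaccess-level solution the asker wants), the kind of whitespace collapsing those sites perform can be sketched in a few lines of Python. A naive regex pass like this is unsafe for <pre>, <textarea>, and inline scripts, which real minifiers handle specially:

```python
import re

def naive_minify_html(html):
    """Collapse inter-tag and intra-text whitespace. NOT safe for <pre>/<textarea>/inline JS."""
    html = re.sub(r">\s+<", "><", html)   # drop whitespace between adjacent tags
    html = re.sub(r"\s+", " ", html)      # collapse remaining whitespace runs
    return html.strip()
```

In practice the gains from HTML minification on top of gzip are small, since gzip already compresses the repetitive whitespace well.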

Is there a quality, file-size, or other benefit to JPEG sizes being multiples of 8px or 16px?

扶醉桌前 Submitted on 2019-11-29 06:55:41
The JPEG compression encoding process splits a given image into blocks of 8x8 pixels, working with these blocks in subsequent lossy and lossless compression steps. [source] It is also mentioned that if the image dimensions are a multiple of one MCU block (a Minimum Coded Unit, 'usually 16 pixels in both directions'), then lossless alterations to the JPEG can be performed. [source] I am working with product images and would like to know both if, and how much, benefit can be derived from using multiples of 16 in my final image size (say, using an image sized 480px by 360px) vs. a non-multiple of 16 (such as

Encoding binary data within XML: Are there better alternatives than base64?

半城伤御伤魂 Submitted on 2019-11-29 06:48:53
Question: I want to encode and decode binary data within an XML file (with Python, but whatever). I have to face the fact that XML content allows only certain characters. The permitted ones are described in the XML spec: Char ::= #x9 | #xA | #xD | [#x20-#xD7FF] | [#xE000-#xFFFD] | [#x10000-#x10FFFF] Which means the disallowed ones are: the Unicode control characters (0x00 - 0x1F) ie ( 000xxxxx ) except 0x09, 0x0A, 0x0D; the UTF-16 surrogate code points (U+D800 - U+DFFF) ie ( 11011xxx ); The special Unicode noncharacters are illegal (0xFFFE - 0xFFFF) ie (

Compress sorted integers

白昼怎懂夜的黑 Submitted on 2019-11-29 06:26:12
I'm building an index which is just several sets of ordered 32-bit integers stored contiguously in a binary file. The problem is that this file grows pretty large. I've been thinking of adding some compression scheme, but that's a bit outside my expertise. So I'm wondering: what compression algorithm would work best in this case? Also, decompression has to be fast, since this index will be used to do lookups. If you are storing integers which are close together (e.g. 1, 3, 4, 5, 9, 10, etc.) rather than random 32-bit integers (982346..., 3487623412.., etc.) you can do one thing: Find
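The answer is heading toward delta (gap) encoding: store the differences between consecutive sorted values, which stay small when the values are close together, and pack each gap as a variable-length integer. A minimal sketch, assuming non-negative values in ascending order:

```python
def delta_varint_encode(sorted_ints):
    """Encode ascending non-negative ints as gaps packed in LEB128-style varints."""
    out = bytearray()
    prev = 0
    for v in sorted_ints:
        gap = v - prev
        prev = v
        while True:
            if gap >= 0x80:
                out.append((gap & 0x7F) | 0x80)  # 7 payload bits, continuation flag set
                gap >>= 7
            else:
                out.append(gap)                  # final byte: continuation flag clear
                break
    return bytes(out)

def delta_varint_decode(data):
    """Invert delta_varint_encode back to the original sorted list."""
    vals, prev = [], 0
    gap, shift = 0, 0
    for b in data:
        gap |= (b & 0x7F) << shift
        shift += 7
        if not b & 0x80:
            prev += gap
            vals.append(prev)
            gap, shift = 0, 0
    return vals
```

Dense runs of small gaps cost one byte per value instead of four, and decoding is a tight linear scan, which keeps lookups fast.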