compression

When compressing and encrypting, should I compress first, or encrypt first?

与世无争的帅哥 submitted on 2019-11-28 16:29:20
If I were to AES-encrypt a file and then ZLIB-compress it, would the compression be less efficient than if I compressed first and then encrypted? In other words, should I compress first or encrypt first, or does it matter?

Compress first. Once you encrypt the file, you generate a stream of effectively random data, which will not be compressible. The compression process depends on finding compressible patterns in the data.

maxbublis: Compression before encryption is surely more space-efficient, but at the same time less secure. That's why I would disagree with the other answers. Most compression
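The reasoning above can be demonstrated in a few lines of Python, using `zlib` and, as a stand-in for real AES output, random bytes (since good ciphertext is designed to be indistinguishable from random data):

```python
import os
import zlib

# Highly patterned plaintext: compresses very well.
plaintext = b"the quick brown fox jumps over the lazy dog\n" * 1000

# Compress first: the patterns are still present, so zlib shrinks it a lot.
compressed = zlib.compress(plaintext)

# "Encrypt first": model the ciphertext as random bytes of the same length.
# Compressing it actually grows it slightly (block overhead, no patterns).
ciphertext = os.urandom(len(plaintext))
compressed_ciphertext = zlib.compress(ciphertext)

print(len(plaintext), len(compressed), len(compressed_ciphertext))
```

On a typical run the compressed plaintext is a tiny fraction of the original, while the "ciphertext" does not shrink at all.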

Python - mechanism to identify compressed file type and uncompress

孤人 submitted on 2019-11-28 16:24:29
A compressed file can be classified into the logical groups below:

a. The operating system you are working on (*ix, Windows, etc.).
b. The type of compression algorithm (.zip, .Z, .bz2, .rar, .gzip), at least from a standard list of the most commonly used compressed formats.
c. The tarball mechanism, where I suppose there is no compression; it acts more like a concatenation.

Now, if we start addressing the above set of compressed files: option (a) would be taken care of by Python, since it is a platform-independent language, but options (b) and (c) seem to have a problem. What do I need? How do
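A common approach (this is a sketch, not an exhaustive detector) is to sniff the file's leading magic bytes, which identify the format regardless of extension or platform:

```python
# Leading "magic" bytes for common compressed formats (illustrative, not
# exhaustive). Note: tar has no leading magic; a POSIX tarball instead has
# b"ustar" at offset 257, which would need a separate check.
MAGIC = [
    (b"\x1f\x8b", "gzip"),
    (b"\x1f\x9d", "compress (.Z)"),
    (b"BZh", "bzip2"),
    (b"PK\x03\x04", "zip"),
    (b"Rar!\x1a\x07", "rar"),
]

def sniff(path):
    """Guess the compression type from the file's first bytes."""
    with open(path, "rb") as f:
        head = f.read(8)
    for magic, name in MAGIC:
        if head.startswith(magic):
            return name
    return "unknown"
```

Once the type is known, you can dispatch to `gzip`, `bz2`, `zipfile`, or `tarfile` from the standard library.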

How does Git save space and is fast at the same time?

我的梦境 submitted on 2019-11-28 16:05:47
I just saw the first Git tutorial at http://blip.tv/play/Aeu2CAI . How does Git store all the versions of all the files, and how can it still be more economical in space than Subversion, which saves only the latest version of the code? I know this can be done using compression, but that would come at the cost of speed; yet this also says that Git is much faster (though where it gains the most is the fact that most of its operations are offline). So my guess is that Git compresses data extensively. It is still faster because decompression + work is still faster than network_fetch + work. Am I
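The space-saving intuition can be illustrated with `zlib` (a sketch of the principle, not of Git's actual packfile delta format): two versions of a file that are individually incompressible cost barely more than one when compressed together, because the second back-references the first inside the compression window.

```python
import os
import zlib

# Two "versions" of a file: v2 is v1 with a one-line edit. Random bytes make
# each version individually incompressible, so any savings below come purely
# from the overlap between the versions.
v1 = os.urandom(8000)
v2 = v1[:4000] + b"one small edit" + v1[4000:]

separate = len(zlib.compress(v1)) + len(zlib.compress(v2))
together = len(zlib.compress(v1 + v2))  # v2 back-references v1

print(separate, together)  # "together" is roughly half of "separate"
```

Git's packfiles make this explicit with delta encoding between similar objects, which is why storing every version can be cheaper than it sounds.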

How does one make a Zip bomb?

為{幸葍}努か submitted on 2019-11-28 15:18:46
This question about zip bombs naturally led me to the Wikipedia page on the topic. The article mentions an example of a 45.1 kB zip file that decompresses to 1.3 exabytes. What are the principles/techniques that would be used to create such a file in the first place? I don't want to actually do this; I'm more interested in a simplified "how stuff works" explanation of the concepts involved. P.S. The article mentions 9 layers of zip files, so it's not a simple case of zipping a bunch of zeros. Why 9, and why 10 files in each? Citing from the Wikipedia page: One example of a Zip bomb is the file 45.1
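The two ingredients, an extreme single-layer ratio plus multiplicative nesting, can be seen with Python's `zipfile` module. (The final figure is back-of-the-envelope arithmetic matching the 45.1 kB / 1.3 EB example from the question, not an actual bomb; the size of the innermost file is inferred, not taken from the article.)

```python
import io
import zipfile

# Ingredient 1: a run of zeros deflates at roughly 1000:1.
payload = b"\x00" * (1024 * 1024)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("zeros.bin", payload)
print(len(payload) // len(buf.getvalue()))  # on the order of 1000

# Ingredient 2: nesting. If each layer holds 10 copies of the layer below,
# the unpacked size is multiplied by 10 per layer. Nine layers over a
# ~1.3 GB innermost file would give:
print(10 ** 9 * 1.3e9)  # ~1.3e18 bytes, i.e. about 1.3 exabytes
```

Nesting is needed because deflate caps out around 1000:1 per layer; the multiplication across layers is what reaches exabyte scale from kilobytes.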

Python script for minifying CSS? [closed]

不羁岁月 submitted on 2019-11-28 15:15:10
I'm looking for a simple Python script that can minify CSS as part of a website deployment process. (Python is the only scripting language supported on the server, and full-blown parsers like CSS Utils are overkill for this project.) Basically, I'd like jsmin.py for CSS: a single script with no dependencies. Any ideas?

Borgar: This seemed like a good task for me to get into Python, which has been pending for a while. I hereby present my first ever Python script: import sys, re with open( sys.argv[1] , 'r' ) as f: css = f.read() # remove comments - this will break a lot of hacks :-P css = re.sub(
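In the same spirit as the (truncated) script above, here is a self-contained sketch of a regex-based minifier; like any regex approach, it will mangle comment-based CSS hacks:

```python
import re
import sys

def minify_css(css):
    """A deliberately simple minifier: strip comments, collapse whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop comments
    css = re.sub(r"\s+", " ", css)                        # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)          # tighten punctuation
    css = re.sub(r";}", "}", css)                         # drop last semicolon
    return css.strip()

if __name__ == "__main__" and len(sys.argv) > 1:
    with open(sys.argv[1]) as f:
        sys.stdout.write(minify_css(f.read()))
```

Usage would be `python minify.py style.css > style.min.css`, assuming the script is saved as `minify.py`.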

Split files using tar, gz, zip, or bzip2 [closed]

最后都变了- submitted on 2019-11-28 14:56:02
I need to compress a large file of about 17-20 GB, and I need to split it into several files of around 1 GB each. I searched for a solution via Google and found ways using the split and cat commands, but they did not work for large files at all. Also, they won't work on Windows, and I need to extract it on a Windows machine.

matpie: You can use the split command with the -b option: split -b 1024m file.tar.gz It can be reassembled on a Windows machine using @Joshua's answer: copy /b file1 + file2 + file3 + file4 filetogether Edit: As @Charlie stated in the comment below, you might want to set a prefix
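If split/cat and copy /b are unavailable, the same chunking is a few lines of portable Python (a sketch; the `.000`, `.001`, ... part-naming scheme is my own choice, not a standard):

```python
CHUNK = 1024 * 1024 * 1024  # 1 GiB per piece

def split_file(path, chunk_size=CHUNK):
    """Split `path` into path.000, path.001, ... of at most chunk_size bytes."""
    parts = []
    with open(path, "rb") as src:
        i = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part = "%s.%03d" % (path, i)
            with open(part, "wb") as dst:
                dst.write(data)
            parts.append(part)
            i += 1
    return parts

def join_files(parts, out_path):
    """Reassemble the pieces (the equivalent of `cat` or `copy /b`)."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())
```

Since this reads and writes in binary chunks, it works the same way on Windows and Unix.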

Read a .Z file (unix compresses file) in Java

僤鯓⒐⒋嵵緔 submitted on 2019-11-28 14:22:26
The said file extension is explained at http://kb.iu.edu/data/abck.html . I want to use a Java API to read the contents of a .Z file. Neither the ZipFile API nor GZIPInputStream seems to work. I can use the ZipFile API to open normal .zip files: ZipFile zf = new ZipFile("CR93H2.Z"); Enumeration entries = zf.entries(); To add, the said .Z file opens up fine in WinRAR. Does anyone know the solution to this? Thanks.

You can use compress-j2me : % svn checkout --quiet http://compress-j2me.googlecode.com/svn/trunk/ compj2me % cd compj2me/src/lzc-test % ant -q % cd build/cmd % echo
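For background, .Z is the output of Unix compress: a 0x1F 0x9D magic header followed by LZW-coded data with 9- to 16-bit codes, which is why neither ZipFile nor GZIPInputStream recognizes it. The core algorithm can be sketched in a few lines of Python (simplified to unbounded integer codes, so this is an illustration of LZW, not a drop-in .Z reader):

```python
def lzw_compress(data):
    """Textbook LZW: grow a phrase table, emit the code of the longest match."""
    table = {bytes([i]): i for i in range(256)}
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = len(table)   # learn the new phrase
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

def lzw_decompress(codes):
    """Rebuild the same phrase table on the fly from the code stream."""
    table = {i: bytes([i]) for i in range(256)}
    w = table[codes[0]]
    out = [w]
    for code in codes[1:]:
        if code in table:
            entry = table[code]
        else:                        # the KwKwK corner case
            entry = w + w[:1]
        out.append(entry)
        table[len(table)] = w + entry[:1]
        w = entry
    return b"".join(out)
```

A real .Z reader additionally has to handle the header flags, the variable code width, and table resets, which is what libraries like compress-j2me implement.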

How to minify jquery files?

巧了我就是萌 submitted on 2019-11-28 13:35:47
I am using jQuery and I have a couple of plugins that don't offer a minified version. I want to take the full version and minify it, but every site I have found where you input your JavaScript and it minifies it breaks the plugin; it must strip something out, because I get a syntax error. So, does anyone have a good one that I can use?

If you're familiar with Java, you could also use the YUI Compressor to minify JS (and CSS) files yourself. We use it here as well and it works great.

JavaScript Compressor Rater: I believe it runs the JS through Rhino and outputs any errors found beforehand, and after

DeflateStream 4GB Limit in .NET

馋奶兔 submitted on 2019-11-28 12:21:35
From MSDN: DeflateStream Class: DeflateStream cannot be used to compress files larger than 4 GB. Are there any other implementations for .NET without the 4 GB limit? NOTE: I really need to decompress a file in GZ format with content larger than 4 GB. Can any code do that?

Take a look at SharpZipLib . Not sure if it's subject to the same limitation, but it's worth a look. FYI, we have removed the 4 GB limit from DeflateStream in .NET 4.

benPearce: There is sample code at CodeProject using the 7-Zip library. The license is open, so you should be able to use this in your project. 7-Zip also supports GZ
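For comparison, here is the streaming pattern that avoids any whole-file size limit, shown in Python; the same chunked-copy loop over a GZipStream is what you would write in .NET (this is a sketch of the technique, not .NET code):

```python
import gzip
import shutil

def gunzip_stream(src_path, dst_path, bufsize=1024 * 1024):
    """Decompress a .gz file in fixed-size chunks, so the content size
    (including > 4 GB) is limited only by disk space, not memory."""
    with gzip.open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        shutil.copyfileobj(src, dst, bufsize)
```

The key point is that nothing ever holds the full decompressed content in memory; each chunk is read, written, and discarded.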

Compress jpeg on server with PHP

本秂侑毒 submitted on 2019-11-28 12:08:35
I have a site with about 1500 JPEG images, and I want to compress them all. Going through the directories is not a problem, but I cannot seem to find a function that compresses a JPEG that is already on the server (I don't want to upload a new one) and replaces the old one. Does PHP have a built-in function for this? If not, how do I read the JPEG from the folder into the script? Thanks.

You're not saying whether you're using GD, so I'll assume you are. $img = imagecreatefromjpeg("myimage.jpg"); // load the image-to-be-saved // 50 is quality; change from 0 (worst quality,smaller file) - 100 (best