compression

An efficient compression algorithm for short text strings [closed]

不想你离开。 Submitted on 2019-11-26 11:09:41
I'm searching for an algorithm to compress small text strings: 50-1000 bytes (i.e. URLs). Which algorithm works best for this? stvchu: Check out Smaz: Smaz is a simple compression library suitable for compressing very short strings. Huffman has a static cost, the Huffman table, so I disagree it's a good choice. There are adaptive versions which do away with this, but the compression ratio may suffer. Actually, the question you should ask is "which algorithm compresses text strings with these characteristics best?". For instance, if long repetitions are expected, simple Run-Length Encoding might be
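A related trick for short strings is a preset dictionary, which sidesteps part of the static-cost problem the answer raises: the shared context lives outside each compressed blob. A minimal Python sketch using zlib's zdict parameter (the dictionary contents and sample URL below are illustrative, not taken from Smaz):

```python
import zlib

# A preset dictionary of substrings we expect our inputs to contain (illustrative).
shared_dict = b"http://www.https://www..com/.org/index.html?id="

def compress_short(s: bytes) -> bytes:
    c = zlib.compressobj(zdict=shared_dict)
    return c.compress(s) + c.flush()

def decompress_short(data: bytes) -> bytes:
    d = zlib.decompressobj(zdict=shared_dict)
    return d.decompress(data) + d.flush()

url = b"http://www.example.org/index.html?id=42"
packed = compress_short(url)
assert decompress_short(packed) == url
# Substrings found in the dictionary become back-references instead of literals,
# so short inputs that resemble the dictionary compress much better than with plain zlib.
print(len(packed), len(zlib.compress(url)))
```

Both sides must agree on the exact dictionary bytes, so this only works when compressor and decompressor are under your control.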

Creating a ZIP Archive in Memory Using System.IO.Compression

让人想犯罪 __ Submitted on 2019-11-26 11:04:33
I'm trying to create a ZIP archive with a simple demo text file using a MemoryStream as follows: using (var memoryStream = new MemoryStream()) using (var archive = new ZipArchive(memoryStream, ZipArchiveMode.Create)) { var demoFile = archive.CreateEntry("foo.txt"); using (var entryStream = demoFile.Open()) using (var streamWriter = new StreamWriter(entryStream)) { streamWriter.Write("Bar!"); } using (var fileStream = new FileStream(@"C:\Temp\test.zip", FileMode.Create)) { memoryStream.CopyTo(fileStream); } } If I run this code, the archive file itself is created but foo.txt isn't. However, if I
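The likely cause is that ZipArchive only finishes writing the archive (including the central directory) when it is disposed, while the copy to disk here runs inside its using block. The working pattern, close the archive first, then read the buffer, can be sketched in Python's zipfile (Python is used here only so the sketch stays self-contained and runnable; the file names are illustrative):

```python
import io
import zipfile

buffer = io.BytesIO()
# Close the archive BEFORE reading the buffer, so the central directory is written.
with zipfile.ZipFile(buffer, mode="w") as archive:
    archive.writestr("foo.txt", "Bar!")

data = buffer.getvalue()  # complete ZIP bytes, now safe to persist to disk

# Verify the entry survived the round trip.
with zipfile.ZipFile(io.BytesIO(data)) as archive:
    assert archive.read("foo.txt") == b"Bar!"
```

The same ordering fixes the C# version: dispose the ZipArchive (and construct it with leaveOpen so the MemoryStream survives), then copy the stream to the FileStream.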

Why use deflate instead of gzip for text files served by Apache?

别等时光非礼了梦想. Submitted on 2019-11-26 11:00:47
What advantages does either method offer for HTML, CSS and JavaScript files served by a LAMP server? Are there better alternatives? The server provides information to a map application using JSON, so a high volume of small files. See also: Is there any performance hit involved in choosing gzip over deflate for HTTP compression? Sam Saffron: Why use deflate instead of gzip for text files served by Apache? The simple answer is: don't. RFC 2616 defines deflate as: deflate: The "zlib" format defined in RFC 1950 in combination with the "deflate" compression mechanism described in RFC 1951. The zlib
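The three wire formats named in the answer differ only in framing around the same DEFLATE stream, which a quick Python comparison makes concrete (the payload is illustrative; gzip is pinned to zlib's default level so only the framing differs):

```python
import zlib
import gzip

payload = b"<html>" + b"hello world " * 100 + b"</html>"

# RFC 1951: raw DEFLATE, no header or checksum (negative wbits disables framing).
raw_obj = zlib.compressobj(wbits=-zlib.MAX_WBITS)
raw = raw_obj.compress(payload) + raw_obj.flush()

# RFC 1950: 2-byte zlib header + DEFLATE + 4-byte Adler-32 trailer.
zlib_data = zlib.compress(payload)

# RFC 1952: 10-byte gzip header + DEFLATE + CRC-32 and length trailer (8 bytes).
gzip_data = gzip.compress(payload, compresslevel=6)

print(len(raw), len(zlib_data), len(gzip_data))
assert zlib.decompress(raw, wbits=-zlib.MAX_WBITS) == payload
assert gzip.decompress(gzip_data) == payload
```

The size deltas are fixed framing overhead (6 bytes for zlib, 18 for gzip), which is why the practical advice is about interoperability, not compression ratio.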

Video compression on android using new MediaCodec Library

拜拜、爱过 Submitted on 2019-11-26 10:33:54
Question: In my app I'm trying to upload some videos that the user picked from the gallery. The problem is that Android video files are usually too big to upload, so we want to compress them first by lowering the bitrate/resolution. I've just heard about the new MediaCodec API that was introduced with API 16 (I previously tried to do this with ffmpeg). What I'm doing right now is the following: first decode the input video using a video decoder, and configure it with the format that was read from the input

Node.js: Gzip compression?

倖福魔咒の Submitted on 2019-11-26 10:21:06
Question: Am I wrong in finding that Node.js does no gzip compression and that there are no modules out there to perform gzip compression? How can anyone use a web server that has no compression? What am I missing here? Should I try to (gasp) port the algorithm to JavaScript for server-side use? Answer 1: Node v0.6.x has a stable zlib module in core now; there are some examples of how to use it server-side in the docs too. An example (taken from the docs): // server example // Running a gzip operation on every

Steganography in lossy compression (JAVA)

不想你离开。 Submitted on 2019-11-26 09:58:15
Question: I have this code for encoding data in JPEG images in Java. I am converting the text to its binary form and inserting it into the LSBs (1, 2, 3 or 4, depending on what the user has chosen) of the RGB values in each pixel, from (0,0) to (width,height). outer: for(int i = 0; i < height; i++){ for(int j = 0; j < width; j++){ Color c = new Color(image.getRGB(j, i)); int red = binaryToInteger(insertMessage(integerToBinary((int)(c.getRed())),numLSB)); int green = binaryToInteger(insertMessage(integerToBinary((int)(c
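The core LSB operation the loop performs on each channel can be sketched without any image I/O; here is a minimal Python version (the function names, the sample channel value and the numLSB choice are illustrative, not from the asker's Java code):

```python
def embed_bits(value: int, bits: str, num_lsb: int) -> int:
    """Replace the num_lsb low bits of an 8-bit channel value with message bits."""
    mask = ~((1 << num_lsb) - 1) & 0xFF   # e.g. num_lsb=2 -> 0b11111100
    return (value & mask) | int(bits, 2)

def extract_bits(value: int, num_lsb: int) -> str:
    """Read the num_lsb low bits of a channel value back as a bit string."""
    return format(value & ((1 << num_lsb) - 1), f"0{num_lsb}b")

red = 0b10110110
stego = embed_bits(red, "01", 2)   # hide the two message bits "01" in the red channel
assert extract_bits(stego, 2) == "01"
```

Note the catch implied by the question's title: JPEG is lossy, so these low-order bits generally do not survive re-encoding. LSB embedding is normally done on lossless formats (PNG, BMP) or on the quantized DCT coefficients of the JPEG itself.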

How can I skip compressing one PNG?

房东的猫 Submitted on 2019-11-26 09:45:47
Question: (Note: I have solved this problem, but it took long enough that I'm posting the question/answer here.) The Xcode build process "optimizes" my PNGs when building. This isn't usually a problem, but iTunesArtwork being processed this way corrupts it so that iTunes is not able to show it. How can I prevent this? Answer 1: You can read more about Xcode's PNG compression here: http://iphonedevelopment.blogspot.com/2008/10/iphone-optimized-pngs.html While you can turn off PNG optimization

Hadoop: compress file in HDFS?

跟風遠走 Submitted on 2019-11-26 09:38:02
Question: I recently set up LZO compression in Hadoop. What is the easiest way to compress a file in HDFS? I want to compress a file and then delete the original. Should I create an MR job with an IdentityMapper and an IdentityReducer that uses LZO compression? Answer 1: I suggest you write a MapReduce job that, as you say, just uses the identity mapper. While you are at it, you should consider writing the data out to sequence files to improve loading performance. You can also store sequence files in block

How do you unzip very large files in python?

半世苍凉 Submitted on 2019-11-26 09:33:59
Question: Using Python 2.4 and the built-in ZipFile library, I cannot read very large zip files (greater than 1 or 2 GB) because it wants to store the entire contents of the uncompressed file in memory. Is there another way to do this (either with a third-party library or some other hack), or must I "shell out" and unzip it that way (which isn't as cross-platform, obviously)? Answer 1: Here's an outline of decompression of large files: import zipfile import zlib import os src = open( doc, "rb" ) zf =
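On modern Python (3.x), the manual zlib loop the answer outlines is no longer needed: ZipFile.open returns a file-like object that decompresses on the fly, so an entry can be copied out in fixed-size chunks. A self-contained sketch (the archive is built in memory and the entry name and sizes are illustrative):

```python
import io
import shutil
import zipfile

# Build a small in-memory archive so the sketch is self-contained.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("big.txt", "x" * 100_000)

out = io.BytesIO()
with zipfile.ZipFile(buf) as zf:
    with zf.open("big.txt") as src:                      # streaming, not fully buffered
        shutil.copyfileobj(src, out, length=64 * 1024)   # copy in 64 KiB chunks

assert out.getvalue() == b"x" * 100_000
```

Memory use is bounded by the chunk size rather than the entry size, which is the property the asker's Python 2.4 ZipFile lacked.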

Fast Concatenation of Multiple GZip Files

偶尔善良 Submitted on 2019-11-26 09:19:10
Question: I have a list of gzip files: file1.gz file2.gz file3.gz Is there a way to concatenate or gzip these files into one gzip file without having to decompress them? In practice we will use this in a web database (CGI), where the server will receive a query from the user, list all the files matching the query, and present them in a batch file back to the user. Answer 1: With gzip files, you can simply concatenate the files together, like so: cat file1.gz file2.gz file3.gz > allfiles.gz Per the gzip RFC
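The cat trick works because a gzip file may contain multiple members, and a compliant reader decompresses all of them in sequence. Python's gzip module reads multi-member streams, so the property can be verified directly (the file contents are illustrative):

```python
import gzip

part1 = gzip.compress(b"hello ")
part2 = gzip.compress(b"world")

# Byte-level concatenation, no recompression -- the same thing `cat` does.
combined = part1 + part2
assert gzip.decompress(combined) == b"hello world"
```

This is exactly what a CGI handler would do: concatenate the already-compressed files into the response without ever inflating them on the server.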