compression

how to compress a PNG image using Java

早过忘川 submitted on 2019-11-27 07:45:23

Question: Hi, I would like to know whether there is any way in Java to reduce the size of an image (using any kind of compression) that was loaded as a BufferedImage and is going to be saved as a PNG. Maybe some sort of PNG ImageWriteParam? I didn't find anything helpful, so I'm stuck. Here's a sample of how the image is loaded and saved: public static BufferedImage load(String imageUrl) { Image image = new ImageIcon(imageUrl).getImage(); bufferedImage = new BufferedImage(image.getWidth(null), image.getHeight(null),
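A minimal sketch of the ImageWriteParam route (the class name PngWriteDemo and the helper are mine, not from the question): whether the ImageIO PNG writer exposes a compression knob varies by JDK, so the safe pattern is to ask the writer's param whether it canWriteCompressed() before setting anything. PNG is always losslessly deflate-compressed either way.

```java
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class PngWriteDemo {
    // Encode a BufferedImage as PNG, requesting stronger compression only
    // if this JDK's PNG writer actually supports a compression setting.
    public static byte[] writePng(BufferedImage image) throws IOException {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("png").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        if (param.canWriteCompressed()) {
            param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            // In the ImageIO convention, lower quality values lean toward
            // smaller output; whether PNG honors this depends on the JDK.
            param.setCompressionQuality(0.0f);
        }
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ImageOutputStream ios = ImageIO.createImageOutputStream(baos)) {
            writer.setOutput(ios);
            writer.write(null, new IIOImage(image, null, null), param);
        } finally {
            writer.dispose();
        }
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        BufferedImage img = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        System.out.println("PNG bytes: " + writePng(img).length);
    }
}
```

If the writer reports canWriteCompressed() as false, resizing the BufferedImage or reducing its color depth before saving is the remaining lever for shrinking the file.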

DeflateStream 4GB Limit in .NET

自作多情 submitted on 2019-11-27 06:55:51

Question: From MSDN: "DeflateStream Class: DeflateStream cannot be used to compress files larger than 4 GB." Are there any other implementations for .NET without the 4 GB limit? NOTE: I really need to decompress a file in GZ format with content larger than 4 GB. Can any code do that? Answer 1: Take a look at SharpZipLib. Not sure if it's subject to the same limitation, but it's worth a look. Answer 2: FYI, we have removed the 4 GB limit from DeflateStream in .NET 4. Answer 3: There is sample code at CodeProject using the 7

How can I determine the length (i.e. duration) of a .wav file in C#?

只谈情不闲聊 submitted on 2019-11-27 06:54:27

In the uncompressed case I know I need to read the WAV header, pull out the number of channels, bits per sample, and sample rate, and work it out from there: (channels) * (bits per sample) * (samples/s) * (seconds) = (file size in bits). Is there a simpler way: a free library, or perhaps something in the .NET Framework? How would I do this if the .wav file is compressed (with an MPEG codec, for example)? Answer (Jan Zich): You may consider using the mciSendString(...) function (error checking is omitted for clarity): using System; using System.Text; using System.Runtime.InteropServices; namespace Sound { public static class
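The header arithmetic from the question can be sketched directly (shown here in Java rather than C#; the class name WavDuration is mine): the byte rate is channels × bytes-per-sample × sample rate, and duration is the data size divided by that.

```java
public class WavDuration {
    // seconds = dataBytes / (channels * bitsPerSample/8 * sampleRate)
    // dataBytes is the size of the WAV "data" chunk, i.e. the raw PCM
    // payload, not the whole file (the header adds a few dozen bytes).
    public static double durationSeconds(long dataBytes, int channels,
                                         int bitsPerSample, int sampleRate) {
        int byteRate = channels * (bitsPerSample / 8) * sampleRate;
        return (double) dataBytes / byteRate;
    }

    public static void main(String[] args) {
        // 16-bit stereo at 44100 Hz: byte rate = 2 * 2 * 44100 = 176400 B/s
        System.out.println(durationSeconds(1_764_000, 2, 16, 44100)); // 10.0
    }
}
```

For real uncompressed WAV files, javax.sound.sampled.AudioSystem.getAudioInputStream gives the same answer without manual header parsing: duration = getFrameLength() / getFormat().getFrameRate().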

java.util.zip.ZipException: invalid distance too far back while decompressing

拈花ヽ惹草 submitted on 2019-11-27 06:48:06

Question: I get a java.util.zip.ZipException: invalid distance too far back when decompressing my data. It occurs on this line: zipInput = new GZIPInputStream(fis); bis = new BufferedInputStream(zipInput); bis.read(buffer); // here the exception occurs. Please help. Answer 1: This archive really has been corrupted. You can form an input stream from bytes: InputStream bStream = new ByteArrayInputStream(bytes); or from a file: InputStream bStream = new FileInputStream(fis); ByteArrayOutputStream bOutStream = new
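Apart from genuine corruption, a single bis.read(buffer) call is also fragile because it may return only part of the stream. A hedged round-trip sketch (class name GzipRoundTrip is mine) showing the standard read-until-EOF loop:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    public static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(baos)) {
            gz.write(data);
        }
        return baos.toByteArray();
    }

    // Read in a loop until read() returns -1; a single read() is not
    // guaranteed to fill the buffer or consume the whole stream.
    public static byte[] gunzip(byte[] compressed) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPInputStream gz =
                 new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = gz.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return out.toByteArray();
    }
}
```

If this loop still throws "invalid distance too far back", the compressed bytes themselves were damaged in transit (e.g. by a text-mode transfer) rather than by the reading code.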

Mac OS X 'compress' option vs command line zip (why do they produce different results?)

回眸只為那壹抹淺笑 submitted on 2019-11-27 06:45:50

Question: I noticed that the command-line 'zip' tool and Mac OS X's 'Compress XXX' option (available via right-click in Finder) produce different output files. Not only is the file a few hundred bytes bigger, but the content is significantly different as well. How can I find out what command the Finder is using for compression? Answer 1: Take a look at the "An AppleScript to compress a Finder selection" article. try tell application "Finder" set theSelection to the selection set selectionCount to

How many times can a file be compressed?

孤街醉人 submitted on 2019-11-27 06:36:01

I was thinking about compression, and it seems there must be some limit to how much compression can be applied to a file, otherwise it would shrink to a single byte. So my question is: how many times can I compress a file before (a) it does not get any smaller, or (b) the file becomes corrupt? Are these two points the same or different? Where does the point of diminishing returns appear? How can these points be found? I'm not talking about any specific algorithm or particular file, just in general. Answer (Nosredna): For lossless compression, the only way you can know how many times you can gain by
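A small experiment makes the diminishing-returns point concrete (class name RecompressDemo is mine): the first deflate pass removes the redundancy, and its output looks close to random, so a second pass has nothing left to exploit and typically only adds container overhead.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.Deflater;
import java.util.zip.DeflaterOutputStream;

public class RecompressDemo {
    public static byte[] deflate(byte[] data) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (DeflaterOutputStream dos = new DeflaterOutputStream(
                 baos, new Deflater(Deflater.BEST_COMPRESSION))) {
            dos.write(data);
        }
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] original = "abc".repeat(4000).getBytes(); // 12000 highly redundant bytes
        byte[] pass1 = deflate(original);
        byte[] pass2 = deflate(pass1);
        // pass1 is a tiny fraction of the input; pass2 gains nothing because
        // pass1's output is already near-maximum entropy.
        System.out.println(original.length + " -> " + pass1.length + " -> " + pass2.length);
    }
}
```

Note that repeated compression never corrupts the data by itself; as long as you decompress the same number of times, the round trip is lossless. The file "becomes corrupt" only if you lose track of how many layers were applied.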

Fast Concatenation of Multiple GZip Files

核能气质少年 submitted on 2019-11-27 06:28:40

I have a list of gzip files: file1.gz, file2.gz, file3.gz. Is there a way to concatenate or gzip these files into one gzip file without having to decompress them? In practice we will use this in a web database (CGI): the server will receive a query from a user, list all the files matching the query, and return them to the user as a single batch file. Answer: With gzip files, you can simply concatenate the files together. Per the gzip RFC, a gzip file consists of a series of "members" (compressed data sets). [...] The members simply appear one after another in the file, with no additional
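The multi-member property can be demonstrated in a few lines (class name GzipConcat is mine): two complete gzip members are appended byte for byte with no recompression, and a single GZIPInputStream, which on current JDKs reads members in sequence, yields the combined payload.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipConcat {
    public static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(baos)) {
            gz.write(data);
        }
        return baos.toByteArray();
    }

    // Decompress a stream that may contain several gzip members in a row;
    // GZIPInputStream on modern JDKs consumes them all in sequence.
    public static byte[] gunzipAll(byte[] compressed) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPInputStream gz =
                 new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = gz.read(buf)) != -1) out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream joined = new ByteArrayOutputStream();
        joined.write(gzip("Hello, ".getBytes())); // member 1, bytes unchanged
        joined.write(gzip("world!".getBytes()));  // member 2 appended directly
        System.out.println(new String(gunzipAll(joined.toByteArray())));
    }
}
```

One caveat for the CGI use case: not every consumer handles multi-member gzip; the `gzip` and `zcat` command-line tools do, but some libraries stop after the first member, so it is worth testing the client side.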

iOS Video Compression Swift iOS 8 corrupt video file

爷，独闯天下 submitted on 2019-11-27 06:25:44

I am trying to compress video taken with the user's camera via UIImagePickerController (not an existing video, but one captured on the fly) to upload to my server, and I want it to take little time, so a smaller size is ideal instead of the 30-45 MB produced by newer cameras. Here is the code to do the compression in Swift for iOS 8, and it compresses wonderfully: I go from 35 MB down to 2.1 MB easily. func convertVideo(inputUrl: NSURL, outputURL: NSURL) { //setup video writer var videoAsset = AVURLAsset(URL: inputUrl, options: nil) as AVAsset var videoTrack = videoAsset.tracksWithMediaType

Which compression method to use in PHP?

独自空忆成欢 submitted on 2019-11-27 06:24:33

I have a large amount of data to move using two PHP scripts: one on the client side, run from the command line, and the other behind Apache. I POST the data to the server side and use the php://input stream to save it on the web-server end. To avoid hitting any memory limits, the data is split into 500 kB chunks, one per POST request. All this works fine. Now, to save bandwidth and speed things up, I want to compress the data before sending and decompress it when received on the other end. I found 3 pairs of functions that can do the job, but I cannot decide which one to use: gzencode /
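The three PHP function pairs differ only in the container around an identical deflate payload: gzdeflate is raw deflate, gzcompress adds the zlib wrapper, and gzencode adds the gzip wrapper. A hedged Java illustration of the same three framings (class and method names are mine):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.Deflater;
import java.util.zip.DeflaterOutputStream;
import java.util.zip.GZIPOutputStream;

public class DeflateFramings {
    static byte[] raw(byte[] data) throws IOException {   // like PHP gzdeflate()
        return run(data, new Deflater(Deflater.DEFAULT_COMPRESSION, true));
    }
    static byte[] zlib(byte[] data) throws IOException {  // like PHP gzcompress()
        return run(data, new Deflater(Deflater.DEFAULT_COMPRESSION, false));
    }
    static byte[] gzip(byte[] data) throws IOException {  // like PHP gzencode()
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(baos)) { gz.write(data); }
        return baos.toByteArray();
    }
    private static byte[] run(byte[] data, Deflater d) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (DeflaterOutputStream dos = new DeflaterOutputStream(baos, d)) {
            dos.write(data);
        }
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "the same payload compressed three ways ".repeat(50).getBytes();
        // Identical deflate payload; only the wrapper differs:
        // raw = no framing, zlib = 2-byte header + 4-byte Adler-32,
        // gzip = 10-byte header + 8-byte CRC-32/size trailer.
        System.out.println("raw deflate: " + raw(data).length);
        System.out.println("zlib:        " + zlib(data).length);
        System.out.println("gzip:        " + gzip(data).length);
    }
}
```

The practical upshot for the PHP scripts: the compression ratio is the same for all three; the choice is about integrity checking (gzip's CRC-32 and zlib's Adler-32 detect corruption, raw deflate does not) and about what the receiving side's decode function expects.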

Best Compression algorithm for a sequence of integers

穿精又带淫゛_ submitted on 2019-11-27 06:11:21

I have a large array of integers that are mostly continuous, e.g. 1-100, 110-160, etc. All integers are positive, unique, and progressively increasing. What would be the best algorithm to compress this? I tried the deflate algorithm, but that gives me only 50% compression. Note that the compression cannot be lossy. Also, if you can point me to a Java implementation of such an algorithm, that would be great. Answer: We have written recent research papers that survey the best schemes for this problem. Please see: Daniel Lemire and Leonid Boytsov, Decoding billions of integers per
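The core idea behind most of the surveyed schemes is delta coding, illustrated here as a hedged sketch (not the papers' actual codecs; names are mine): for a sorted, mostly-consecutive sequence, the gaps between neighbors are almost all 1, so encoding gaps instead of values produces a stream that a varint or deflate pass then shrinks far below 50%.

```java
import java.util.Arrays;

public class DeltaEncode {
    // Replace each value with its gap from the previous value (prev starts at 0).
    public static int[] deltas(int[] sorted) {
        int[] d = new int[sorted.length];
        int prev = 0;
        for (int i = 0; i < sorted.length; i++) {
            d[i] = sorted[i] - prev;
            prev = sorted[i];
        }
        return d;
    }

    // Inverse transform: running sum of the gaps restores the original values.
    public static int[] undeltas(int[] d) {
        int[] out = new int[d.length];
        int acc = 0;
        for (int i = 0; i < d.length; i++) {
            acc += d[i];
            out[i] = acc;
        }
        return out;
    }

    public static void main(String[] args) {
        int[] values = {1, 2, 3, 4, 100, 101, 102, 200};
        System.out.println(Arrays.toString(deltas(values))); // [1, 1, 1, 1, 96, 1, 1, 98]
        System.out.println(Arrays.equals(values, undeltas(deltas(values)))); // true
    }
}
```

The transform is trivially lossless, which matches the question's constraint; the compression win comes entirely from the gap stream's low entropy.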