compression

java.lang.OutOfMemoryError: Direct buffer memory when invoking Files.readAllBytes

Submitted by 喜夏-厌秋 on 2019-12-25 07:29:03

Question: I've got the following code, which is designed to read a directory and compress it into a tar.gz archive. When I deploy the code onto the server and test it with a batch of files, it works on the first few test batches, but after the 4th or 5th batch it starts consistently throwing java.lang.OutOfMemoryError: Direct buffer memory, even though the batch size stays the same and the heap space looks fine. Here's the code: public static void compressDirectory(String
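A common remedy for this class of error (a sketch of the general technique, not the poster's actual fix) is to stream each file into the archive through a small fixed-size buffer instead of loading it whole with Files.readAllBytes, which can drive large direct-buffer allocations in the NIO layer:

```java
import java.io.*;
import java.nio.file.*;

public class ChunkedCopy {
    // Copy a file into an output stream in fixed-size chunks so no buffer
    // larger than 8 KB is ever allocated, regardless of the file's size.
    static void copyInChunks(Path source, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        try (InputStream in = Files.newInputStream(source)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[100_000]);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copyInChunks(tmp, sink);
        System.out.println(sink.size()); // 100000
        Files.delete(tmp);
    }
}
```

The same `copyInChunks` pattern works when `out` is a TarArchiveOutputStream entry: memory use stays constant no matter how large each batch grows.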

extract zlib file in iOS?

Submitted by 心不动则不痛 on 2019-12-25 06:37:37

Question: When downloading data from a server that's compressed using zlib, what steps are needed to decompress it and save it to Core Data? Any help would be greatly appreciated. Answer 1: Use something like ASIHTTP to retrieve the file off the network; it can handle gzip, which is likely what the data coming off the network is compressed with. If you'd prefer to roll your own, take a look at this zlib category on NSData: it has worked well for me in the past and likely does what you'll need.
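The zlib format itself is platform-neutral, so the round trip can be illustrated outside iOS. Here is a minimal Java sketch (an illustration of zlib deflate/inflate, not the NSData category the answer refers to) using java.util.zip, which reads and writes the same zlib stream format:

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class ZlibRoundTrip {
    // Compress bytes into the zlib format (the format the question's server uses).
    static byte[] deflate(byte[] data) {
        Deflater deflater = new Deflater();
        deflater.setInput(data);
        deflater.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        while (!deflater.finished()) {
            out.write(buf, 0, deflater.deflate(buf));
        }
        deflater.end();
        return out.toByteArray();
    }

    // Decompress a zlib stream back into the original bytes.
    static byte[] inflate(byte[] compressed) throws DataFormatException {
        Inflater inflater = new Inflater();
        inflater.setInput(compressed);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        while (!inflater.finished()) {
            out.write(buf, 0, inflater.inflate(buf));
        }
        inflater.end();
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        byte[] original = "hello zlib".getBytes("UTF-8");
        byte[] restored = inflate(deflate(original));
        System.out.println(new String(restored, "UTF-8")); // hello zlib
    }
}
```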

Compressing images in existing PDFs makes the resulting PDF file bigger (Lowagie's resizing method and real compression method)

Submitted by 别说谁变了你拦得住时间么 on 2019-12-25 05:29:32

Question: I'm having a problem with image compression. I used the answer described in this question: compress pdf with large images via java. If I set the FACTOR variable to 0.9f or 1f (original size), the resulting PDF file gets bigger than the ORIGINAL. But that is not the case for all files: some files I created myself get smaller as planned, but others grow by about a third, and I get black backgrounds on some images on top of it. This gets even worse when I'm using the normal
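One general reason for the growth (an illustration separate from the iText code in the question): re-encoding an image that is already JPEG-compressed at a high quality factor often produces more bytes than the original stream contained. A Java sketch comparing JPEG output sizes at two quality settings:

```java
import javax.imageio.*;
import javax.imageio.stream.MemoryCacheImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class JpegQuality {
    // Encode an image as JPEG at the given quality (0.0f-1.0f) and return the byte count.
    static int jpegSize(BufferedImage img, float quality) throws IOException {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writer.setOutput(new MemoryCacheImageOutputStream(out));
        writer.write(null, new IIOImage(img, null, null), param);
        writer.dispose();
        return out.size();
    }

    public static void main(String[] args) throws IOException {
        // A synthetic gradient image so the encoder has real work to do.
        BufferedImage img = new BufferedImage(200, 200, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 200; y++)
            for (int x = 0; x < 200; x++)
                img.setRGB(x, y, (x * 255 / 200) << 16 | (y * 255 / 200));
        int low = jpegSize(img, 0.3f);
        int high = jpegSize(img, 0.95f);
        System.out.println(high > low); // true
    }
}
```

If the PDF's images are already JPEGs, writing them back at FACTOR 0.9f-1f repeats exactly this high-quality re-encode, which explains files growing by a third. The black backgrounds are a separate, common symptom of dropping an alpha channel when re-encoding to JPEG, which has no transparency.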

Is it necessary to use multiple gzip members for input larger than 4GB?

Submitted by 浪尽此生 on 2019-12-25 05:27:13

Question: By stating "Features: no 4GB limit ... Idzip just uses multiple gzip members to have no file size limit," the author of idzip seems to imply that multiple gzip members are necessary to support data > 4GB. But the deflate algorithm, whose output gzip members merely wrap with a header and footer, evidently supports more than 4GB of input. So is it really necessary to use multiple gzip members to compress more than 4GB of data? Answer 1: Even .NET's GZipStream, which does not support multiple members
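For context: the gzip trailer stores the uncompressed length modulo 2^32 (RFC 1952), so a single member can carry more than 4 GB of data; multiple members mainly keep the stored length unambiguous and allow seekable chunks. A small Java sketch (tiny inputs standing in for multi-gigabyte halves) showing that concatenated members decompress back into one stream:

```java
import java.io.*;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class MultiMember {
    // Produce one complete gzip member (header + deflate data + trailer).
    static byte[] gzipMember(byte[] data) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(data);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Concatenate two independent gzip members, as idzip does for big inputs.
        ByteArrayOutputStream concat = new ByteArrayOutputStream();
        concat.write(gzipMember("first half ".getBytes("UTF-8")));
        concat.write(gzipMember("second half".getBytes("UTF-8")));

        // java.util.zip.GZIPInputStream transparently reads across member boundaries.
        GZIPInputStream in = new GZIPInputStream(
                new ByteArrayInputStream(concat.toByteArray()));
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) restored.write(buf, 0, n);
        System.out.println(restored.toString("UTF-8")); // first half second half
    }
}
```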

create and read tar.bz2 files in perl

Submitted by 梦想的初衷 on 2019-12-25 05:23:33

Question: I have been trying to use system("tar -jcvf archive_name.tar.bz2 $my_file"), but I get an error: archive_name.tar.bz2: Cannot open: No such file or directory; tar: Error is not recoverable: exiting now; tar: Child returned status 2. Is it not possible to create a .tar.bz2 using this method in Perl? I would prefer not to use a module, but will if it is absolutely necessary. Answer 1: First of all, you're not doing it in Perl. You're spawning a separate process that runs the command. If you want to do this
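The "Cannot open" error usually means the interpolated $my_file contains spaces or shell metacharacters, or the working directory isn't what you expect. Whatever the host language, the robust pattern is to pass each argument separately (no shell string) and check the exit status. A Java sketch of that idea with ProcessBuilder (file names here are hypothetical, and gzip is used since bzip2 support varies across systems):

```java
import java.io.IOException;
import java.nio.file.*;

public class TarArchive {
    public static void main(String[] args) throws IOException, InterruptedException {
        Path dir = Files.createTempDirectory("tardemo");
        Path data = dir.resolve("my file.txt");  // space in name: fatal if shell-interpolated
        Files.write(data, "hello".getBytes("UTF-8"));
        Path archive = dir.resolve("archive_name.tar.gz");

        // Each argument is its own list element, so no quoting problems arise.
        // (Swap -z for -j to get bzip2, where available.)
        Process p = new ProcessBuilder(
                "tar", "-czf", archive.toString(),
                "-C", dir.toString(), data.getFileName().toString())
                .inheritIO()
                .start();
        int status = p.waitFor();
        System.out.println(status == 0 && Files.exists(archive)); // true
    }
}
```

In Perl the equivalent is the list form, system("tar", "-jcvf", $archive, $my_file), plus checking $? afterwards.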

Reduce file size with DotNetZip

Submitted by ぃ、小莉子 on 2019-12-25 04:32:33

Question: I want to back up my database using SMO and zip the backup file, then upload it to my server. Here is my code: // Backup var conn = new SqlConnection(sqlConnectionString); var server = new Server(new ServerConnection(conn)); var backupMgr = new Backup(); backupMgr.Devices.AddDevice(file.FullName, DeviceType.File); backupMgr.Database = dictionary[StringKeys.Database]; backupMgr.Action = BackupActionType.Database; backupMgr.SqlBackup(server); // Compress using (var zip = new ZipFile()) { zipFile =
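Database backup files are usually highly redundant and compress well if the archiver is told to use its strongest deflate level. A Java sketch of the same shrink-the-backup step (java.util.zip standing in for DotNetZip; paths are illustrative):

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.zip.Deflater;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class MaxZip {
    // Zip a single file at the strongest deflate level (9).
    static void zipFile(Path source, Path target) throws IOException {
        try (ZipOutputStream zip = new ZipOutputStream(Files.newOutputStream(target))) {
            zip.setLevel(Deflater.BEST_COMPRESSION);
            zip.putNextEntry(new ZipEntry(source.getFileName().toString()));
            Files.copy(source, zip);   // stream the file in, never fully in memory
            zip.closeEntry();
        }
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("backup", ".bak");
        byte[] repetitive = new byte[50_000];  // .bak files are typically redundant
        Files.write(src, repetitive);
        Path dst = Files.createTempFile("backup", ".zip");
        zipFile(src, dst);
        System.out.println(Files.size(dst) < Files.size(src)); // true
    }
}
```

DotNetZip exposes the equivalent knob as `zip.CompressionLevel`; if the output is still large, the backup may already be compressed at the SQL Server level, in which case zipping gains little.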

CWAC Camera: why is my SimpleCameraHost saveImage so slow? Am I doing something wrong?

Submitted by 末鹿安然 on 2019-12-25 03:55:41

Question: How can I optimize this piece of code? The saveImage method takes about a minute. class ObrolSimpleHost extends SimpleCameraHost { private final String[] SCAN_TYPES = {"image/webp"}; private Context context = null; public ObrolSimpleHost(Context _ctxt) { super(_ctxt); this.context = getActivity(); } @Override public void saveImage(PictureTransaction xact, Bitmap bitmap) { File photo = getPhotoPath(); if (photo.exists()) { photo.delete(); } try { FileOutputStream fos = new FileOutputStream
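Two frequent culprits with code like this: WebP encoding is much slower than JPEG on many devices, and writing the encoded image through an unbuffered FileOutputStream multiplies small syscalls. The buffering half can be shown in plain Java (Android's Bitmap.compress would produce the bytes in the original; the byte array here is a stand-in):

```java
import java.io.*;

public class BufferedSave {
    // Write bytes through an 8 KB buffer so the OS sees a few large writes
    // instead of many small ones.
    static void save(byte[] encoded, File photo) throws IOException {
        try (OutputStream out = new BufferedOutputStream(new FileOutputStream(photo))) {
            out.write(encoded);
            out.flush();
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] fakeJpeg = new byte[1_000_000];  // stand-in for Bitmap.compress output
        File photo = File.createTempFile("photo", ".jpg");
        save(fakeJpeg, photo);
        System.out.println(photo.length()); // 1000000
        photo.delete();
    }
}
```

On Android, the same wrap applies: pass `new BufferedOutputStream(fos)` to Bitmap.compress, and consider JPEG instead of WebP if the minute-long save persists.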

using gzip compression with webpack and express, still serving bundle.js instead of bundle.js.gz

Submitted by 一曲冷凌霜 on 2019-12-25 03:17:38

Question: I've just installed and set up gzip compression for my webpack and express files. The plugins section of my webpack.config.js now looks like this: plugins: [ new webpack.DefinePlugin({ // <-- key to reducing React's size 'process.env': { 'NODE_ENV': JSON.stringify('production') } }), new webpack.optimize.UglifyJsPlugin(), // minify everything new webpack.optimize.AggressiveMergingPlugin(), // merge chunks new CompressionPlugin({ asset: "[path].gz[query]", algorithm: "gzip", test: /\.js$|\.css$|\

Compression Streams

Submitted by ∥☆過路亽.° on 2019-12-25 02:45:00

Question: I've been trying to implement a compression method in one of my programs. I want it to take in a stream, compress it, and return the compressed stream. (It returns a stream because I want to be able to pass the stream to another function without having to save it to a file and re-read it later.) I had a working test version based on the MSDN example for GZipStream, and this is what I came up with when I tried to convert it to taking in and returning streams: public static Stream compress
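The classic pitfall with this pattern (in .NET's GZipStream and in Java alike) is reading the compressed bytes before the compressing stream has been closed, so the gzip trailer is never flushed and the result is truncated. A Java sketch of a compress-to-stream helper that closes the compressor before handing back the result:

```java
import java.io.*;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class StreamCompressor {
    // Compress everything from `in` and return a fresh stream over the gzip bytes.
    static InputStream compress(InputStream in) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        // try-with-resources closes the GZIPOutputStream, flushing the gzip
        // trailer BEFORE we snapshot the buffer; skip this and the data is truncated.
        try (GZIPOutputStream gz = new GZIPOutputStream(buffer)) {
            byte[] chunk = new byte[4096];
            int n;
            while ((n = in.read(chunk)) != -1) gz.write(chunk, 0, n);
        }
        return new ByteArrayInputStream(buffer.toByteArray());
    }

    public static void main(String[] args) throws IOException {
        InputStream source = new ByteArrayInputStream("stream me".getBytes("UTF-8"));
        GZIPInputStream back = new GZIPInputStream(compress(source));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = back.read(buf)) != -1) out.write(buf, 0, n);
        System.out.println(out.toString("UTF-8")); // stream me
    }
}
```

The .NET equivalent of the close-before-read rule: dispose the GZipStream (or call Close) before returning the underlying MemoryStream, then reset its Position to 0.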