compression

Tar problem with Apache Commons Compress

余生颓废 submitted on 2019-12-08 06:02:07
Question: I'm having a hard time trying to tar some files using the compress library. My code is the following, and is taken from the commons-compress wiki examples: private static File createTarFile(String[] filePaths, String saveAs) throws Exception{ File tarFile = new File(saveAs); OutputStream out = new FileOutputStream(tarFile); TarArchiveOutputStream aos = (TarArchiveOutputStream) new ArchiveStreamFactory().createArchiveOutputStream("tar", out); for(String filePath : filePaths){ File file = new
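The snippet above is cut off mid-loop. As a hedged illustration of the same add-entry / copy / close-entry pattern: the sketch below uses the JDK's `ZipOutputStream` so it runs without extra jars; with commons-compress you would substitute `TarArchiveOutputStream` and `TarArchiveEntry`. File names here are placeholders, not the asker's paths.

```java
import java.io.*;
import java.nio.file.*;
import java.util.zip.*;

public class ZipSketch {
    // Analogous to the commons-compress tar loop: one entry per input file.
    static File createZipFile(String[] filePaths, String saveAs) throws IOException {
        File zipFile = new File(saveAs);
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(zipFile))) {
            for (String filePath : filePaths) {
                File file = new File(filePath);
                zos.putNextEntry(new ZipEntry(file.getName())); // entry name inside the archive
                Files.copy(file.toPath(), zos);                 // stream the file's bytes into the entry
                zos.closeEntry();                               // each entry must be closed
            }
        }
        return zipFile;
    }

    public static void main(String[] args) throws IOException {
        // hypothetical demo files
        Path a = Files.write(Files.createTempFile("a", ".txt"), "hello".getBytes());
        Path b = Files.write(Files.createTempFile("b", ".txt"), "world".getBytes());
        File zip = createZipFile(new String[]{a.toString(), b.toString()},
                Files.createTempFile("out", ".zip").toString());
        System.out.println("archive written: " + zip.length() + " bytes");
    }
}
```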

lightweight RESTful PHP server

ぐ巨炮叔叔 submitted on 2019-12-08 05:55:27
Question: I want to write an extremely lightweight PHP server which handles data requests from remote clients. The data returned is tabular (like data read from a CSV file or a database table). The "problem" is that I could be returning potentially several hundred thousand rows of data, with between 10 and 15 columns (depending on the type of data requested). In short, the data returned could be HUGE, and in an attempt to save bandwidth, and also increase the speed of transmission, I
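The question targets PHP, but the bandwidth-saving idea it is heading toward, streaming the rows through a gzip-compressed output instead of buffering the full payload, can be sketched with the JDK (class name and row contents are made up for the demo; the byte buffer stands in for the HTTP response body):

```java
import java.io.*;
import java.util.zip.*;

public class GzipRows {
    // Write CSV rows straight into a gzip stream; the full table is never held in memory.
    static byte[] gzipCsv(int rows) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream(); // stands in for the response body
        try (Writer w = new OutputStreamWriter(new GZIPOutputStream(buf), "UTF-8")) {
            for (int i = 0; i < rows; i++) {
                w.write(i + ",alpha,beta,gamma\n"); // hypothetical 4-column row
            }
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] compressed = gzipCsv(100_000);
        System.out.println("gzipped 100k rows into " + compressed.length + " bytes");
    }
}
```

Tabular text is highly repetitive, so gzip typically cuts it by an order of magnitude or more.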

Zip more than one file in GZipStream

橙三吉。 submitted on 2019-12-08 05:37:46
Question: How can I zip more than one file in a GZipStream? I have 3 XML files and I want to zip them into one .gz file, and when I decompress I should get all 3 separate files. How can I do this? Answer 1: gzip is only designed to hold a single file. You will need to collate the files in some other container before gzipping them. Answer 2: You can tar them together first. SharpZipLib has support for tar, as well as its own GZip implementation. See the SharpZipLib.Tar namespace. The docs are here. Source: https:/
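The question is about .NET's GZipStream, but the container idea from the answers is language-agnostic. A hedged JDK sketch (zip instead of tar+gz, since the JDK has no built-in tar support; the zip format both bundles and compresses, so the three files come back out separately):

```java
import java.io.*;
import java.util.zip.*;

public class MultiFileContainer {
    // gzip compresses exactly one stream, so bundle multiple files in a
    // container format; here each file becomes one named zip entry.
    static byte[] bundle(String[] names, byte[][] contents) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(buf)) {
            for (int i = 0; i < names.length; i++) {
                zos.putNextEntry(new ZipEntry(names[i]));
                zos.write(contents[i]);
                zos.closeEntry();
            }
        }
        return buf.toByteArray();
    }

    static int countEntries(byte[] zip) throws IOException {
        int n = 0;
        try (ZipInputStream zin = new ZipInputStream(new ByteArrayInputStream(zip))) {
            while (zin.getNextEntry() != null) n++;
        }
        return n;
    }

    public static void main(String[] args) throws IOException {
        byte[] zip = bundle(new String[]{"a.xml", "b.xml", "c.xml"},
                new byte[][]{"<a/>".getBytes(), "<b/>".getBytes(), "<c/>".getBytes()});
        System.out.println("entries in archive: " + countEntries(zip));
    }
}
```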

Is repacking a repository useful for large binaries?

梦想的初衷 submitted on 2019-12-08 05:36:12
Question: I'm trying to convert a large history from Perforce to Git, and one folder (now a git branch) contains a significant number of large binary files. My problem is that I'm running out of memory while running git gc --aggressive. My primary question here is whether repacking the repository is likely to have any meaningful effect on large binaries. Compressing them another 20% would be great; 0.2% isn't worth my effort. If not, I'll have them skipped over as suggested here. For background, I
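An aside, hedged as general git behavior rather than the (not preserved) accepted answer: binaries that are already compressed internally (media, archives) typically gain almost nothing from delta recompression, while gc's memory use can be capped with standard pack options. The values below are illustrative, not tuned recommendations, and the scratch repo exists only so the sketch runs standalone:

```shell
# Scratch demo repo; in practice run the `git config` lines inside your own repository.
demo=$(mktemp -d) && cd "$demo" && git init -q
git config pack.windowMemory 100m      # per-thread memory budget for delta search
git config pack.threads 1              # a single thread keeps peak memory predictable
git config core.bigFileThreshold 50m   # blobs above this are stored whole, never deltified
git gc                                 # now bounded by the settings above
```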

A Java library to compress (e.g. LZW) a string

回眸只為那壹抹淺笑 submitted on 2019-12-08 05:23:10
Question: Apache Commons Compress works only with archive files (please correct me if I am wrong). I need something like MyDB.put(LibIAmLookingFor.compress("My long string to store")); String getBack = LibIAmLookingFor.decompress(MyDB.get()); And LZW is just an example, it could be anything similar. Thank you. Answer 1: You have a plethora of choices. You can use java.util.zip.Deflater for the Deflate algorithm: try { // Encode a String into bytes String inputString = "blahblahblah??"; byte[] input =
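The answer's snippet is cut off. A hedged, self-contained completion of the idea: `compress`/`decompress` string helpers built on the JDK's own `Deflater`/`Inflater` (the method names mirror the asker's hypothetical `LibIAmLookingFor`, they are not a real library's API):

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class StringCodec {
    // String -> deflate-compressed bytes, suitable for storing in a DB.
    static byte[] compress(String s) throws Exception {
        Deflater d = new Deflater(Deflater.BEST_COMPRESSION);
        d.setInput(s.getBytes("UTF-8"));
        d.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        while (!d.finished()) out.write(buf, 0, d.deflate(buf));
        d.end();
        return out.toByteArray();
    }

    // Compressed bytes -> original String.
    static String decompress(byte[] data) throws Exception {
        Inflater inf = new Inflater();
        inf.setInput(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        while (!inf.finished()) out.write(buf, 0, inf.inflate(buf));
        inf.end();
        return new String(out.toByteArray(), "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        String s = "My long string to store, repeated: my long string to store";
        System.out.println(decompress(compress(s)).equals(s)); // round trip is lossless
    }
}
```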

AesZipFileEncrypter zipAndEncrypt method adds all folder tree to file

谁都会走 submitted on 2019-12-08 05:09:48
Question: I'm using this method to zip and encrypt a file: AesZipFileEncrypter.zipAndEncrypt This code: AesZipFileEncrypter.zipAndEncrypt(new File("C:\Test\Folder\MyFile.txt"), new File("C:\Test\Folder\MyZip.zip"), password, aesEncrypter); also compresses the folder tree of my file, not just the file. For example, adding C:\Test\Folder\MyFile.txt, the created zip file also contains the folders C:\Test\Folder, while I would like to have just MyFile.txt in the root of the archive. Is it possible?
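AesZipFileEncrypter comes from the winzipaes library, whose exact API I won't guess at here. The underlying mechanism is common to zip writers, though: the archive records whatever entry name you supply, so passing just `file.getName()` instead of the full path puts the file in the archive root. A hedged JDK sketch of that:

```java
import java.io.*;
import java.util.zip.*;

public class RootEntry {
    // The entry name decides the path inside the archive: give it only the
    // file name and no folder tree is recorded.
    static byte[] zipIntoRoot(File file) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(buf);
             InputStream in = new FileInputStream(file)) {
            zos.putNextEntry(new ZipEntry(file.getName())); // "MyFile.txt", not "Test/Folder/MyFile.txt"
            in.transferTo(zos);
            zos.closeEntry();
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("MyFile", ".txt");
        try (ZipInputStream zin = new ZipInputStream(new ByteArrayInputStream(zipIntoRoot(tmp)))) {
            System.out.println("entry name in archive: " + zin.getNextEntry().getName());
        }
    }
}
```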

How to compress a random string?

有些话、适合烂在心里 submitted on 2019-12-08 04:50:18
Question: I'm working on an encryptor application based on the RSA asymmetric algorithm. It generates a key pair which the user has to keep. As key pairs are long random strings, I want to create a function that lets me compress the generated long random strings (key pairs) based on a pattern. (For example, the function gets a string that contains 100 characters and returns a string that contains 30 characters.) So when the user enters the compressed string I can regenerate the key pair based on the
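The preserved text cuts off before any answer, so the following is a general information-theory caveat rather than the original reply: uniformly random data has no redundancy for a lossless compressor to exploit, so no function can reliably shrink 100 random characters to 30 and get them back. A quick JDK check of that claim:

```java
import java.security.SecureRandom;
import java.util.zip.Deflater;

public class RandomCompress {
    // Deflate an input buffer in one shot and report the compressed size.
    static int deflatedSize(byte[] input) {
        Deflater d = new Deflater(Deflater.BEST_COMPRESSION);
        d.setInput(input);
        d.finish();
        byte[] out = new byte[input.length * 2 + 64]; // large enough for worst case
        int n = d.deflate(out);
        d.end();
        return n;
    }

    public static void main(String[] args) {
        byte[] random = new byte[1024];
        new SecureRandom().nextBytes(random);
        // Random bytes carry maximal entropy: the "compressed" form is not smaller.
        System.out.println("1024 random bytes deflate to " + deflatedSize(random) + " bytes");
    }
}
```

If the key pair is not random text but a structured encoding (e.g. base64 of DER), a shorter *re-encoding* may exist, but that is a format change, not compression.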

Compress dynamic content to ServletOutputStream

▼魔方 西西 submitted on 2019-12-08 04:48:24
Question: I want to compress the dynamically created content and write it to the ServletOutputStream directly, without saving it as a file on the server before compressing. For example, I created an Excel Workbook and a StringBuffer that holds strings with the SQL template. I don't want to save the dynamic content to .xlsx and .sql files on the server before zipping the files and writing to the ServletOutputStream for downloading. Sample Code: ServletOutputStream out = response.getOutputStream(); workbook.write
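`ZipOutputStream` wraps any `OutputStream`, so the response stream can be zipped directly with no temp files. A hedged sketch with a `ByteArrayOutputStream` standing in for the ServletOutputStream; the entry names and payload bytes are placeholders (with POI you would call `workbook.write(zos)` inside the first entry instead):

```java
import java.io.*;
import java.util.zip.*;

public class StreamZip {
    // Zip two in-memory payloads straight into the target stream; in a servlet
    // you would pass response.getOutputStream() as the target.
    static void writeZip(OutputStream target, byte[] xlsxBytes, byte[] sqlBytes) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(target)) {
            zos.putNextEntry(new ZipEntry("report.xlsx")); // hypothetical entry name
            zos.write(xlsxBytes);                          // e.g. workbook.write(zos) with POI
            zos.closeEntry();
            zos.putNextEntry(new ZipEntry("script.sql"));
            zos.write(sqlBytes);
            zos.closeEntry();
        } // closing the ZipOutputStream finishes the archive
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream(); // stands in for ServletOutputStream
        writeZip(out, "xlsx-bytes".getBytes(), "SELECT 1;".getBytes());
        System.out.println("zip of " + out.size() + " bytes written, no temp files");
    }
}
```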

Python 3.x: Compression Makes File Bigger :(

回眸只為那壹抹淺笑 submitted on 2019-12-08 04:24:19
Question: OK. Recently I was testing out a piece of code for a small project. It required me to compress some files, and it actually makes the file size bigger, unless there is a problem in what it prints. Here's my code: def Compress(z): # line spacing may be off a little because I'm new to Stack Overflow import zlib, sys, time, base64 text = open(z, "rb").read() print ("Raw Size:", sys.getsizeof(text)) compressed = zlib.compress(text, 9) print ("Compressed Size:", sys.getsizeof(compressed)) ratio =
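Two things plausibly inflate the numbers above (the original answer isn't preserved, so this is offered as general background): `sys.getsizeof` counts Python object overhead on top of the data, where `len(data)` measures the bytes themselves; and genuinely small or already-compressed inputs grow anyway, because the compressed format carries fixed overhead (zlib adds a 2-byte header and a 4-byte checksum). The second effect, demonstrated in Java since that is the language of this digest's other examples:

```java
import java.util.Arrays;
import java.util.zip.Deflater;

public class SmallInputs {
    // Deflate a buffer in one shot and report the compressed size.
    static int deflatedSize(byte[] input) {
        Deflater d = new Deflater(Deflater.BEST_COMPRESSION);
        d.setInput(input);
        d.finish();
        byte[] out = new byte[input.length * 2 + 64];
        int n = d.deflate(out);
        d.end();
        return n;
    }

    public static void main(String[] args) {
        byte[] tiny = "hi".getBytes();          // fixed overhead dominates: output grows
        byte[] big = new byte[10_000];
        Arrays.fill(big, (byte) 'a');           // redundant data: output shrinks dramatically
        System.out.println("2 bytes -> " + deflatedSize(tiny) + " bytes");
        System.out.println("10000 bytes of 'a' -> " + deflatedSize(big) + " bytes");
    }
}
```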

Does the ExternalInterface, in Flex 3, have a data size limitation?

情到浓时终转凉″ submitted on 2019-12-08 03:59:41
Question: I am using the ExternalInterface in Flex 3. We are actually using Flex to compress a large amount of DOM data, so this is specifically being used with LARGE data. To investigate further: if there is a limitation, is it universal? (e.g. Silverlight) First, let me state that this is being done with an application that was made by inexperienced software engineers. We need to buy time by compressing the data so that we can build a long-term solution. We have no other options,