compression

How do I list the contents of a .gz file without extracting it in Python?

﹥>﹥吖頭↗ Submitted on 2019-12-04 03:35:19
Question: I have a .gz file and I need to get the names of the files inside it using Python. This question is the same as this one; the only difference is that my file is .gz, not .tar.gz, so the tarfile library did not help me here. I am using the requests library to request a URL, and the response is a compressed file. Here is the code I am using to download the file:
response = requests.get(line.rstrip(), stream=True)
if response.status_code == 200:
    with open(str(base_output_dir)+"/"+str(current_dir)+"/"+str(count)+"
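
Unlike a .tar.gz, a plain .gz stream wraps a single file, and the only name it can carry is the optional FNAME field in its header, which Python's gzip module does not expose directly. A minimal sketch that reads that field by parsing the header by hand (it assumes the response has already been saved to a local path; the path handling and function name are illustrative):

import struct

FEXTRA, FNAME = 0x04, 0x08

def gzip_original_name(path):
    """Return the original filename stored in a .gz header, or None if absent."""
    with open(path, "rb") as f:
        header = f.read(10)
        if len(header) < 10 or header[:2] != b"\x1f\x8b":
            raise ValueError("not a gzip file")
        flags = header[3]
        if flags & FEXTRA:                      # skip the optional "extra" field first
            (xlen,) = struct.unpack("<H", f.read(2))
            f.read(xlen)
        if not flags & FNAME:                   # no original name was recorded
            return None
        name = bytearray()
        while True:                             # FNAME is zero-terminated
            byte = f.read(1)
            if byte in (b"", b"\x00"):
                break
            name += byte
        return name.decode("latin-1")           # RFC 1952 specifies ISO 8859-1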

AngularJS compress $http post data

馋奶兔 Submitted on 2019-12-04 03:29:47
Question: I'm creating an Ionic app that needs to send large amounts of data to a server written in PHP. I'm looking for a way to compress the data I post to speed up my app, but I'm not sure what the best approach is. I tried LZString, but the size of the compressToEncodedURIComponent return value is too big for my needs; I then tried pako, but I still wasn't satisfied with the compression rate. Which is the best way to compress the data I post to the server? Should I compress it separately (with one of the
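
For a language-neutral sense of the sizes involved (pako emits the same zlib/deflate format that Python's zlib module produces), here is a small sketch with a made-up JSON payload that compares the raw bytes, the deflated bytes, and the base64 text that would actually travel in a text-safe POST body; the payload is purely illustrative:

import base64
import json
import zlib

# Hypothetical payload standing in for the app's data.
payload = json.dumps({"items": [{"id": i, "note": "sample text"} for i in range(1000)]}).encode()

deflated = zlib.compress(payload, 9)       # zlib/deflate stream, the format pako.deflate produces
wire = base64.b64encode(deflated)          # text-safe encoding adds roughly a third back on top

print(len(payload), len(deflated), len(wire))

Much of the overhead of URI-safe schemes such as compressToEncodedURIComponent comes from that final text-encoding step, so posting the compressed bytes as a binary body and inflating them on the server keeps more of the savings.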

What Is The Best Python Zip Module To Handle Large Files?

不想你离开。 Submitted on 2019-12-04 03:28:14
EDIT: Specifically compression and extraction speeds. Any suggestions? Thanks. So I made a random-ish large zipfile:
$ ls -l *zip
-rw-r--r-- 1 aleax 5000 115749854 Nov 18 19:16 large.zip
$ unzip -l large.zip | wc
23396 93633 2254735
i.e., 116 MB with 23.4K files in it, and timed things:
$ time unzip -d /tmp large.zip >/dev/null
real 0m14.702s
user 0m2.586s
sys 0m5.408s
This is the system-supplied command-line unzip binary -- no doubt as finely tuned and optimized as a pure C executable can be. Then (after cleaning up /tmp ;-)...:
$ time py26 -c'from zipfile import ZipFile; z=ZipFile("large.zip");
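
For comparison, the truncated one-liner above can be spelled out as a small timing harness around the standard-library zipfile module (assuming the same large.zip; the /tmp/unzipped target directory is arbitrary):

import time
import zipfile

start = time.perf_counter()
with zipfile.ZipFile("large.zip") as z:
    names = z.namelist()
    z.extractall("/tmp/unzipped")           # roughly comparable to `unzip -d /tmp large.zip`
elapsed = time.perf_counter() - start

print(f"zipfile.extractall: {elapsed:.2f}s for {len(names)} entries")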

Fast video compression like Whatsapp

折月煮酒 Submitted on 2019-12-04 03:24:08
Question: I need to speed up video compression in my Android app. I'm using FFmpeg and it takes 3 minutes to compress an 80 MB video. Does anyone know a better solution? The command I'm using is:
/data/data/com.moymer/app_bin/ffmpeg -y -i /storage/emulated/0/DCIM/Camera/VID_20150803_164811363.mp4 -s 640x352 -r 25 -vcodec mpeg4 -ac 1 -preset ultrafast -strict -2 /storage/emulated/0/DCIM/Camera/compressed_video.mp4
I'm running this command using FFmpeg for Android from this GitHub repo: https://github.com

Best way to compress HTML, CSS & JS with mod_deflate and mod_gzip disabled

拟墨画扇 Submitted on 2019-12-04 03:18:08
I have a few sites on a shared host that is running Apache 2. I would like to compress the HTML, CSS and Javascript that is delivered to the browser. The host has disabled mod_deflate and mod_gzip, so these options are out. I do have PHP 5 at my disposal, though, so I could use the gzip component of that. I am currently placing the following in my .htaccess file: php_value output_handler ob_gzhandler However, this only compresses the HTML and leaves out the CSS and JS. Is there a reliable way of transparently compressing the output of the CSS and JS without having to change every page? I have

Python/Pandas create zip file from csv

拈花ヽ惹草 Submitted on 2019-12-04 02:53:37
Can anyone provide an example of how to create a zip file from a CSV file using Python/Pandas? Thank you. Use df.to_csv('my_file.gz', compression='gzip'). From the docs: compression : string, optional — a string representing the compression to use in the output file; allowed values are ‘gzip’, ‘bz2’, ‘xz’, only used when the first argument is a filename. See the discussion of support for zip files here. In response to Stefan's answer, add '.csv.gz' for the zipped CSV file to work: df.to_csv('my_file.csv.gz', compression='gzip'). Hope that helps. Source: https://stackoverflow.com/questions/37754165/python-pandas
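
The gzip route above produces a .gz file rather than a true .zip archive. If an actual .zip is required, one sketch that works regardless of whether the installed pandas version accepts compression='zip' is to hand the CSV text to the standard-library zipfile module; the file names and DataFrame here are placeholders:

import zipfile

import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": ["x", "y", "z"]})    # stand-in data

with zipfile.ZipFile("my_file.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("my_file.csv", df.to_csv(index=False))       # store the CSV as one entry in the archive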

Creating a gzip stream from separately compressed chunks

▼魔方 西西 Submitted on 2019-12-04 02:16:05
Question: I'd like to be able to generate a gzip (.gz) file using concurrent CPU threads, i.e., deflating separate chunks of the input file with separately initialized z_stream records. The resulting file should be readable by zlib's inflate() function in a classic single-threaded operation. Is that possible, even if it requires customized zlib code? The only requirement is that the currently existing zlib inflate code can handle it. Update: The pigz source code demonstrates how it
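
This is essentially what pigz does: deflate each chunk independently as a raw stream, end every non-final chunk with a full flush so it stops on a byte boundary without a final block, then concatenate the pieces between a single gzip header and trailer; a standard single-threaded inflate reads the result as one ordinary member. A rough Python sketch of the idea (illustrative only: no error handling, and it skips the dictionary priming between chunks that pigz performs):

import struct
import zlib
from concurrent.futures import ThreadPoolExecutor

def deflate_chunk(args):
    chunk, is_last = args
    # Raw deflate (wbits = -15): no per-chunk zlib/gzip wrapper, so pieces concatenate cleanly.
    comp = zlib.compressobj(9, zlib.DEFLATED, -15)
    data = comp.compress(chunk)
    # Non-final chunks end with Z_FULL_FLUSH (byte-aligned, no BFINAL block set);
    # only the very last chunk actually finishes the deflate stream.
    data += comp.flush(zlib.Z_FINISH if is_last else zlib.Z_FULL_FLUSH)
    return data

def parallel_gzip(chunks, out_path):
    jobs = [(c, i == len(chunks) - 1) for i, c in enumerate(chunks)]
    with ThreadPoolExecutor() as pool:          # zlib releases the GIL while compressing
        parts = list(pool.map(deflate_chunk, jobs))

    crc, size = 0, 0
    for chunk in chunks:                        # the trailer covers the whole uncompressed input
        crc = zlib.crc32(chunk, crc)
        size += len(chunk)

    with open(out_path, "wb") as f:
        f.write(b"\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\xff")              # minimal 10-byte gzip header
        for part in parts:
            f.write(part)
        f.write(struct.pack("<II", crc & 0xFFFFFFFF, size & 0xFFFFFFFF))  # CRC32 + ISIZE trailer

The output should be accepted by gzip.decompress(), zlib's inflate with gzip wrapping enabled, or plain gunzip.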

Optimized order of HTML attributes for compression

家住魔仙堡 Submitted on 2019-12-04 02:01:55
I read somewhere that organizing HTML attributes in a certain order can improve the compression rate for the HTML document (I think I read this in a Google or Yahoo recommendation for faster sites). If I recall correctly, the recommendation was to put the most common attributes first (e.g. id, etc.) and then put the rest in alphabetical order. I'm a bit confused by this. For example, if id attributes were put right after every p tag, the id would contain unique values, so the duplicated string would be limited to this: <p id=" (say there were <p id="1"> and <p id="2"/> ). Because the value
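
The intuition behind the recommendation can be checked with a toy measurement: build the same elements twice, once with a fixed attribute order and once with the order shuffled per element, and compare the deflate sizes (the markup and numbers are purely illustrative, not a benchmark):

import random
import zlib

random.seed(0)

def attrs(i):
    return [f'id="p{i}"', 'class="note"', 'data-role="item"']

fixed_order = "".join(f'<p {" ".join(attrs(i))}>text</p>' for i in range(500))
mixed_order = "".join(f'<p {" ".join(random.sample(attrs(i), 3))}>text</p>' for i in range(500))

# With a consistent order, everything after the unique id value repeats verbatim between
# elements, so the LZ77 stage finds much longer back-references despite the unique ids.
print(len(zlib.compress(fixed_order.encode(), 9)),
      len(zlib.compress(mixed_order.encode(), 9)))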

How to find uncompressed size of ionic zip file

喜夏-厌秋 Submitted on 2019-12-04 01:46:51
Question: I have a zip file compressed using Ionic Zip. Before extracting it, I need to verify the available disk space, but how do I find the uncompressed size beforehand? Is there any header information in the zip file (written by Ionic) that I can read? Answer 1: This should do the trick:
static long totaluncompressedsize;
static string info;
. . . .
// Option 1
foreach (ZipEntry e in zip)
{
    long uncompressedsize = e.UncompressedSize;
    totaluncompressedsize += uncompressedsize;
}
// Or
// Option 2 - will need
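
The per-entry uncompressed sizes that the snippet sums come straight from the zip central directory, so the figure can be cross-checked from any zip reader; for example, a quick sanity check with Python's standard zipfile module (the archive name is a placeholder):

import zipfile

with zipfile.ZipFile("archive.zip") as z:
    total = sum(info.file_size for info in z.infolist())    # file_size is the uncompressed size

print(f"uncompressed total: {total} bytes")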

Compress data in php and uncompress in javascript [closed]

最后都变了- Submitted on 2019-12-04 01:40:47
Question: Closed. This question needs to be more focused. It is not currently accepting answers. Closed 5 years ago. Greetings all, is there a way to compress data sent from PHP (server) and then uncompress the data using JavaScript (client)? Thank you. Answer 1: I have to agree with @Domenic's answer here. @Nishchay Sharma is way off. The only thing I'll add is if you want to do this on a per