compression

What is the canonical method for an HTTP client to instruct an HTTP server to disable gzip responses?

守給你的承諾、 submitted on 2019-11-27 17:31:01
Question: I thought this was a simple Google search, but apparently I'm wrong on that. I've seen that you should supply Accept-Encoding: gzip;q=0,deflate;q=0 in the request headers. However, the article that suggested it also noted that proxies routinely ignore that header. Also, when I supplied it to nginx, it still compressed the response message body. http://forgetmenotes.blogspot.ca/2009/05/how-to-disable-gzip-compression-in.html So, how do I tell a web server to disable compression on the
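A quick way to experiment with this from the command line is curl: request only the identity coding (no transformation) and check whether the server still sets Content-Encoding. This is only a sketch with example.com as a placeholder, and as noted above, servers and proxies are free to ignore the preference:

```
# Ask for an uncompressed response; grep prints nothing if no Content-Encoding was applied
curl -s -o /dev/null -D - -H "Accept-Encoding: identity" http://example.com/ | grep -i content-encoding

# Compare against an explicit request for gzip
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" http://example.com/ | grep -i content-encoding
```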

How to enable GZIP compression in IIS 7.5

旧时模样 submitted on 2019-11-27 17:21:01
I want to compress my files using GZIP. Can you share the web.config code for compressing files with GZIP? Is there anything more that I have to do after uploading my web.config file? Ryan: GZip compression can be enabled directly through IIS. First, open up IIS, go to the website you are hoping to tweak, and hit the Compression page. If gzip is not installed, you will see something like the following: “The dynamic content compression module is not installed.” We should fix this, so we go to “Turn Windows features on or off”, select “Dynamic Content Compression”, and click the OK button.
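Once the module is installed, compression is usually switched on with a small web.config fragment; the following is a sketch of the commonly used urlCompression element under system.webServer (verify the exact settings against your own IIS 7.5 setup):

```xml
<configuration>
  <system.webServer>
    <!-- Turn on compression for static files and dynamically generated responses -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>
```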

Error Deflate And Inflate With zLib

邮差的信 submitted on 2019-11-27 17:11:55
Question: I'm trying to compile the zpipe.c example on my Linux (Ubuntu 8.04) machine with gcc, but I'm getting some errors; take a look:
[ubuntu@eeepc:~/Desktop] gcc zpipe.c
/tmp/ccczEQxz.o: In function `def':
zpipe.c:(.text+0x65): undefined reference to `deflateInit_'
zpipe.c:(.text+0xd3): undefined reference to `deflateEnd'
zpipe.c:(.text+0x150): undefined reference to `deflate'
zpipe.c:(.text+0x1e8): undefined reference to `deflateEnd'
zpipe.c:(.text+0x27b): undefined reference to `deflateEnd'
/tmp/ccczEQxz
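Those undefined references come from the linker rather than the compiler: zpipe.c uses zlib's deflate/inflate functions, so the zlib library has to be linked in with -lz, for example:

```
# Link against zlib so deflateInit_, deflate, deflateEnd, etc. are resolved
gcc zpipe.c -o zpipe -lz
```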

Reducing video size with same format and reducing frame size

白昼怎懂夜的黑 submitted on 2019-11-27 17:04:41
This question might be very basic. Is there a way to reduce the frame size/rate of a lossy compressed (WMV, MPEG) video to get a smaller video, of lesser size, with the same format? Are there any open source or proprietary APIs for this? Jason B: ffmpeg provides this functionality. All you need to do is run something like
ffmpeg -i <inputfilename> -s 640x480 -b 512k -vcodec mpeg1video -acodec copy <outputfilename>
For newer versions of ffmpeg you need to change -b to -b:v:
ffmpeg -i <inputfilename> -s 640x480 -b:v 512k -vcodec mpeg1video -acodec copy <outputfilename>
to convert the input video file
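The commands above change only the frame size and bitrate; to also lower the frame rate, ffmpeg's -r option sets the output rate. A sketch, reusing the placeholder filenames from the answer:

```
# Scale to 640x480, cap the video bitrate at 512k, and drop the output to 24 frames per second
ffmpeg -i <inputfilename> -s 640x480 -b:v 512k -r 24 -vcodec mpeg1video -acodec copy <outputfilename>
```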

How can I tell if my server is serving GZipped content?

偶尔善良 submitted on 2019-11-27 16:51:00
I have a webapp on an Nginx server. I set gzip on in the conf file and now I'm trying to see if it works. YSlow says it's not, but 5 out of 6 websites that do the test say it is. How can I get a definitive answer on this, and why is there a difference in the results? It looks like one possible answer is, unsurprisingly, curl:
$ curl http://example.com/ --silent --write-out "%{size_download}\n" --output /dev/null
31032
$ curl http://example.com/ --silent -H "Accept-Encoding: gzip,deflate" --write-out "%{size_download}\n" --output /dev/null
2553
In the second case the client tells the server that
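If you would rather look at the response headers than at the download size, the same check can be done by dumping the headers and searching for Content-Encoding (example.com is a placeholder for your own host):

```
# A "Content-Encoding: gzip" line here means the server compressed this response
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip,deflate" http://example.com/ | grep -i content-encoding
```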

Using Java Deflater/Inflater with custom dictionary causes IllegalArgumentException

爱⌒轻易说出口 submitted on 2019-11-27 16:43:59
Question: The following code is based on the example given in the javadocs for java.util.zip.Deflater. The only change I have made is to create a byte array called dict and then set the dictionary on both the Deflater and Inflater instances using the setDictionary(byte[]) method. The problem I'm seeing is that when I call Inflater.setDictionary() with the exact same array as I used for the Deflater, I get an IllegalArgumentException. Here is the code in question: import java.util.zip.Deflater; import
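A detail that commonly triggers this exception: java.util.zip.Inflater only accepts a preset dictionary after it has signalled that it needs one, so Inflater.setDictionary() must be called after an initial inflate() returns 0 with needsDictionary() true, not right after setInput(). A minimal sketch of that ordering (the data and buffer sizes are illustrative, not the asker's code):

```java
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class DictionaryExample {
    public static void main(String[] args) throws Exception {
        byte[] dict = "commonly repeated text".getBytes("UTF-8");
        byte[] input = "commonly repeated text appears here".getBytes("UTF-8");

        // Compress with a preset dictionary
        Deflater deflater = new Deflater();
        deflater.setDictionary(dict);
        deflater.setInput(input);
        deflater.finish();
        byte[] compressed = new byte[256];
        int compressedLen = deflater.deflate(compressed);
        deflater.end();

        // Decompress: the dictionary may only be set once the inflater asks for it
        Inflater inflater = new Inflater();
        inflater.setInput(compressed, 0, compressedLen);
        byte[] result = new byte[256];
        int n = inflater.inflate(result);            // returns 0 and sets needsDictionary()
        if (n == 0 && inflater.needsDictionary()) {
            inflater.setDictionary(dict);            // now legal; same bytes the deflater used
            n = inflater.inflate(result);
        }
        inflater.end();
        System.out.println(new String(result, 0, n, "UTF-8"));
    }
}
```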

pdftk compression option

谁都会走 submitted on 2019-11-27 16:42:16
I use pdftk to compress a PDF using the following command line:
pdftk file1.pdf output file2.pdf compress
It works, as the size of my file decreased. Are there [options] to change the compression? Or maybe other solutions to compress my file? It is heavy because some graphics have a lot of points. Is there a way to convert these graphs to JPG, for instance, and adapt the compression? nullglob: I had the same problem and found two different solutions (see this thread for more details). Both reduced the size of my uncompressed PDF dramatically. Pixelated (lossy): convert input.pdf -compress Zip
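The excerpt ends mid-command, but another commonly used tool for this job is Ghostscript, which rewrites the PDF and downsamples embedded images without rasterizing whole pages; a sketch of the usual invocation (the /ebook preset trades quality for size, /screen compresses harder):

```
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook \
   -dNOPAUSE -dQUIET -dBATCH -sOutputFile=output.pdf input.pdf
```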

HTTP request compression

情到浓时终转凉″ submitted on 2019-11-27 16:15:45
General use-case: Imagine a client that is uploading large amounts of JSON. The Content-Type should remain application/json because that describes the actual data. Accept-Encoding and Transfer-Encoding seem to be for telling the server how it should format the response. It appears that responses use the Content-Encoding header explicitly for this purpose, but it is not a valid request header. Is there something I am missing? Has anyone found an elegant solution? Specific use-case: My use-case is that I have a mobile app that is generating large amounts of JSON (and some binary data in some cases
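The question is cut off, but a common approach to this problem is for the client to gzip the request body itself and label it with Content-Encoding: gzip; note that the server (or a filter/middleware in front of it) must be configured to decompress request bodies, since that is not automatic. A sketch with plain HttpURLConnection, where the URL and payload are placeholders:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class GzipUpload {
    public static void main(String[] args) throws Exception {
        String json = "{\"example\": \"large JSON payload\"}";

        HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/upload").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");   // describes the underlying data
        conn.setRequestProperty("Content-Encoding", "gzip");           // describes the applied coding

        // Compress the body on the fly while writing it
        try (OutputStream os = new GZIPOutputStream(conn.getOutputStream())) {
            os.write(json.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Response code: " + conn.getResponseCode());
    }
}
```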

Possible to compress SQL Server network traffic?

泪湿孤枕 submitted on 2019-11-27 15:57:27
Question: I have a .NET client that needs to connect to a remote SQL Server over the WAN; is it possible to compress SQL traffic between the client and the server? I am using .NET 3.5 and SQL Server 2005 and greater. Answer 1: Looking at connectionstrings.com here for SQL Server 2008, the database providers do not offer any kind of compression scheme... You may need to write a wrapper on a different port that compresses the data: using the front end, send the data across that port, from there,
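In the spirit of the wrapper idea in the answer, one low-effort way to compress the traffic without changing the application is to tunnel the connection through something that compresses, for example an SSH tunnel with compression enabled; a sketch, where the host names and the 1433 port mapping are assumptions about the environment:

```
# Forward local port 1433 through a compressed SSH tunnel to the remote SQL Server,
# then point the .NET connection string at localhost,1433
ssh -C -N -L 1433:sqlserver.internal:1433 user@gateway.example.com
```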

Zip support in Apache Spark

走远了吗. submitted on 2019-11-27 14:48:13
I have read about Spark's support for gzip-type input files here, and I wonder if the same support exists for other kinds of compressed files, such as .zip files. So far I have tried computing a file compressed inside a zip file, but Spark seems unable to read its contents successfully. I have taken a look at Hadoop's newAPIHadoopFile and newAPIHadoopRDD, but so far I have not been able to get anything working. In addition, Spark supports creating a partition for every file under a specified folder, like in the example below: SparkConf SpkCnf = new SparkConf().setAppName("SparkApp")
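The excerpt stops before the example, but since Hadoop has no built-in splittable codec for .zip, a common workaround is to read each archive whole with binaryFiles and unpack it with ZipInputStream inside the job. A rough Java sketch, where the path and the assumption that the zip entries are text files are illustrative, and the flatMap signature shown is the Spark 2.x one:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipInputStream;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ZipRead {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkApp");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Each .zip file becomes one (path, stream) pair; zips are not splittable,
        // so every archive is processed by a single task.
        JavaRDD<String> lines = sc.binaryFiles("/data/archives/*.zip")
            .flatMap(pair -> {
                List<String> out = new ArrayList<>();
                try (ZipInputStream zis = new ZipInputStream(pair._2().open())) {
                    while (zis.getNextEntry() != null) {
                        BufferedReader br = new BufferedReader(new InputStreamReader(zis));
                        String line;
                        while ((line = br.readLine()) != null) {
                            out.add(line);
                        }
                    }
                }
                return out.iterator();   // Spark 2.x FlatMapFunction returns an Iterator
            });

        System.out.println("Line count: " + lines.count());
        sc.stop();
    }
}
```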