compression

Static compression in IIS does not work for htm, js files

筅森魡賤 submitted on 2019-12-01 18:01:15
I'm trying to configure IIS 7.5 to compress static htm and js files. Does anyone know why it does not work for me? Here is my web.config for the web site:

    <httpCompression>
      <dynamicTypes>
        <add mimeType="text/*" enabled="true" />
        <add mimeType="message/*" enabled="true" />
        <add mimeType="application/x-javascript" enabled="true" />
        <add mimeType="*/*" enabled="false" />
      </dynamicTypes>
      <staticTypes>
        <add mimeType="text/*" enabled="true" />
        <add mimeType="message/*" enabled="true" />
        <add mimeType="application/x-javascript" enabled="true" />
        <add mimeType="*/*" enabled="false" />
      </staticTypes>
      …

Why can't I use “CompactDatabase” in DAO.DBEngine.36 using VBScript?

隐身守侯 submitted on 2019-12-01 17:47:23
I'm trying to make a small VBScript that compacts an MS Access 2007 database file. The code I have is:

    Set acc2007 = CreateObject("DAO.DBEngine.36")
    acc2007.CompactDatabase "C:\test.accdb", "C:\test2.accdb", Nothing, Nothing, ";pwd=test"
    Set acc2007 = Nothing

I'm getting this error when I run the three lines with "cscript test.vbs" from a 32-bit cmd.exe:

    C:\test.vbs(10, 1) DAO.DbEngine: Unrecognized database format 'C:\test.accdb'.

The database was created with MS Access 2007; when I open it by double-clicking the icon, I type the password "test" and it opens normally. It says "Access 2007"

Fast video compression like WhatsApp

…衆ロ難τιáo~ submitted on 2019-12-01 17:44:00
I need to speed up video compression in my Android app. I'm using FFmpeg and it takes 3 minutes to compress an 80 MB video. Does anyone know a better solution? The command I'm using is:

    /data/data/com.moymer/app_bin/ffmpeg -y -i /storage/emulated/0/DCIM/Camera/VID_20150803_164811363.mp4 -s 640x352 -r 25 -vcodec mpeg4 -ac 1 -preset ultrafast -strict -2 /storage/emulated/0/DCIM/Camera/compressed_video.mp4

I'm running this command using FFmpeg for Android from this GitHub repo: https://github.com/guardianproject/android-ffmpeg-java The code to use FFmpeg in my project is inside an AsyncTask and is

How to open and read an LZMA file in memory

最后都变了- submitted on 2019-12-01 17:39:34
I have a giant file, let's call it one-csv-file.xz. It is an XZ-compressed CSV file. How can I open and parse through the file without first decompressing it to disk? What if the file is, for example, 100 GB? Python cannot read all of that into memory at once, of course. Will it page or run out of memory?

You can iterate through an LZMAFile object:

    import lzma  # Python 3; try lzmaffi in Python 2

    with open('one-csv-file.xz', 'rb') as compressed:
        with lzma.LZMAFile(compressed) as uncompressed:
            for line in uncompressed:
                do_stuff_with(line)

You can decompress incrementally. See Compression using the
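A rough sketch of that incremental route, using lzma.LZMADecompressor so only one compressed chunk and a small line buffer are ever held in memory (the chunk size and the do_stuff_with placeholder are illustrative, not from the original answer):

    import lzma

    decompressor = lzma.LZMADecompressor()
    leftover = b''
    with open('one-csv-file.xz', 'rb') as compressed:
        while True:
            chunk = compressed.read(1024 * 1024)   # 1 MiB of compressed input
            if not chunk:
                break
            data = leftover + decompressor.decompress(chunk)
            lines = data.split(b'\n')
            leftover = lines.pop()                 # hold back any partial last line
            for line in lines:
                pass  # do_stuff_with(line)
    # whatever remains in `leftover` is the final line without a trailing newline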

Is there a way to store gzip's dictionary from a file?

荒凉一梦 submitted on 2019-12-01 17:32:30
I've been doing some research on compression-based text classification, and I'm trying to figure out a way of storing the dictionary built by the encoder (on a training file) so it can be reused 'statically' on a test file. Is this at all possible using UNIX's gzip utility? For example, I have been using two 'class' files, sport.txt and atheism.txt, so I want to compress both of these files and store the dictionaries they build. Next I want to take a test file (which is unlabelled and could be either atheism or sport) and, by using the prebuilt dictionaries on this test.txt, I can analyse how well
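gzip's command-line tool does not expose or persist its internal dictionary, but a common stand-in for this kind of compression-based classification is to measure how many extra bytes the test file costs when it is compressed together with each class file. A rough Python sketch of that idea, reusing the file names from the question (the approach itself is an assumed substitute for a stored dictionary, not a gzip feature):

    import gzip

    def compressed_size(data):
        # Size of the gzip-compressed representation of the data.
        return len(gzip.compress(data))

    def extra_cost(class_file, test_file):
        # Extra bytes the test text costs when compressed after the class text,
        # i.e. when it can reuse the patterns ("dictionary") seen in that class.
        with open(class_file, 'rb') as f:
            class_data = f.read()
        with open(test_file, 'rb') as f:
            test_data = f.read()
        return compressed_size(class_data + test_data) - compressed_size(class_data)

    # The class whose text "explains" the test file best adds the least cost.
    costs = {name: extra_cost(name, 'test.txt') for name in ('sport.txt', 'atheism.txt')}
    print(min(costs, key=costs.get))

One caveat: DEFLATE only looks back about 32 KB, so with very large class files most of the "dictionary" effect comes from their final 32 KB.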

Encoding issue with requesting JSON from StackOverflow API

梦想的初衷 submitted on 2019-12-01 17:18:22
I can't figure this out for the life of me. Below is an implementation with the request module, but I've also tried with the node-XMLHttpRequest module to no avail.

    var request = require('request');
    var url = 'http://api.stackexchange.com/2.1/questions?pagesize=100&fromdate=1356998400&todate=1359676800&order=desc&min=0&sort=votes&tagged=javascript&site=stackoverflow';

    request.get({ url: url }, function(error, response, body) {
        if (error || response.statusCode !== 200) {
            console.log('There was a problem with the request');
            return;
        }
        console.log(body); // outputs gibberish characters like �
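The gibberish is usually not a string-encoding problem but a compressed body: the Stack Exchange API serves its responses gzip-compressed, so the raw bytes must be gunzipped before they are readable JSON (with the request module, setting gzip: true in the options is the usual fix). A rough Python sketch of the same idea, reusing the URL from the question as-is:

    import gzip
    import json
    from urllib.request import Request, urlopen

    # URL copied from the question; the 2.1 API version and parameters are taken as-is.
    url = ('http://api.stackexchange.com/2.1/questions?pagesize=100&fromdate=1356998400'
           '&todate=1359676800&order=desc&min=0&sort=votes&tagged=javascript&site=stackoverflow')

    req = Request(url, headers={'Accept-Encoding': 'gzip'})
    with urlopen(req) as resp:
        body = resp.read()
        if resp.headers.get('Content-Encoding') == 'gzip':
            body = gzip.decompress(body)   # the body is gzip bytes, not text

    data = json.loads(body)
    print(len(data['items']))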

Compressing and decompressing streams

≯℡__Kan透↙ submitted on 2019-12-01 16:58:00
I found this article about a simple proxy server implemented in Java: http://www.java2s.com/Code/Java/Network-Protocol/Asimpleproxyserver.htm The code simply reads a stream from the client, sends it to the server, then reads the stream from the server and sends the response back to the client. What I would like to do is compress these streams before they are sent and decompress them after they are received. I found the class GZIPInputStream, but I'm not sure how to use it, and what I found on the internet didn't help me. Either I didn't understand it well enough or it was not a good solution for me. My idea
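Whatever the language, the pattern is symmetric: wrap the outgoing raw stream in a compressing writer and the incoming raw stream in a decompressing reader, and make sure the compressor is closed or finished so its trailer is flushed (in Java that is the GZIPOutputStream / GZIPInputStream pair). A tiny Python sketch of that flow, with an in-memory buffer standing in for the socket between the two sides:

    import gzip
    import io

    # io.BytesIO stands in for the raw socket stream between proxy and server.
    wire = io.BytesIO()

    # Sending side: wrap the raw stream in a compressing writer.
    with gzip.GzipFile(fileobj=wire, mode='wb') as outgoing:
        outgoing.write(b'GET / HTTP/1.1\r\nHost: example.com\r\n\r\n')
    # Closing the writer flushes the gzip trailer; skipping this step is the
    # usual reason the receiving side sees a truncated or corrupt stream.

    # Receiving side: wrap the same raw stream in a decompressing reader.
    wire.seek(0)
    with gzip.GzipFile(fileobj=wire, mode='rb') as incoming:
        print(incoming.read().decode())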

PHP: Is gzdeflate safe across multiple machines?

被刻印的时光 ゝ submitted on 2019-12-01 16:38:06
In the PHP manual there is a comment on gzdeflate saying: "gzcompress produces longer data because it embeds information about the encoding onto the string. If you are compressing data that will only ever be handled on one machine, then you don't need to worry about which of these functions you use. However, if you are passing data compressed with these functions to a different machine you should use gzcompress." Then, running 50,000 repetitions on various content, I found that gzdeflate() and gzcompress() both performed equally fast regardless of content and compression level, but gzinflate()
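For what it's worth, the two functions differ only in framing: gzdeflate() emits a raw DEFLATE stream (RFC 1951), while gzcompress() wraps that same stream in a zlib container (RFC 1950), i.e. a 2-byte header plus a 4-byte Adler-32 checksum. Neither embeds anything machine-specific, so both move safely between machines. The relationship can be seen with Python's zlib module, which exposes both formats (a sketch, not PHP):

    import zlib

    data = b'example payload example payload example payload' * 50

    # zlib-wrapped stream, the format gzcompress() produces (RFC 1950):
    # 2-byte header, DEFLATE body, 4-byte Adler-32 checksum.
    wrapped = zlib.compress(data)

    # Raw DEFLATE stream, the format gzdeflate() produces (RFC 1951).
    raw_compressor = zlib.compressobj(zlib.Z_DEFAULT_COMPRESSION, zlib.DEFLATED, -zlib.MAX_WBITS)
    raw = raw_compressor.compress(data) + raw_compressor.flush()

    print(len(wrapped) - len(raw))                         # header + checksum overhead
    print(zlib.decompress(wrapped) == data)                # True
    print(zlib.decompress(raw, -zlib.MAX_WBITS) == data)   # True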