compression

How to reduce the size of an sqlite3 database for iPhone?

醉酒当歌 submitted on 2019-11-30 09:40:27
Edit: many thanks for all the answers. Here are the results after applying the optimisations so far: switching to sorting the characters plus run-length encoding brought the DB down to 42 MB, and dropping the indexes on the booleans brought it down to 33 MB. The really nice part is that this hasn't required any changes in the iPhone code. I have an iPhone application with a large dictionary held in SQLite format (read only). I'm looking for ideas to reduce the size of the DB file, which is currently very large. Here is the number of entries and resulting size of the SQLite DB: franks-macbook:DictionaryMaker frank$ ls -lh …
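As a side note on the "sort the characters and run-length encode" step: a tiny illustration of the idea (this is not the poster's DictionaryMaker code, which isn't shown; Python is used purely for brevity).

    def rle_sorted(word):
        # Sort the letters so repeated characters become adjacent,
        # then collapse each run into "<char><count>".
        s = "".join(sorted(word))
        out, i = [], 0
        while i < len(s):
            j = i
            while j < len(s) and s[j] == s[i]:
                j += 1
            out.append(s[i] + (str(j - i) if j - i > 1 else ""))
            i = j
        return "".join(out)

    print(rle_sorted("mississippi"))  # prints "i4mp2s4" (11 chars down to 7)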

Compression Libraries For C++ [closed]

一世执手 submitted on 2019-11-30 09:17:22
I was reading about compression in programs and started a new simple project, a zipper (just a zipper, not an unzipper), but I only found zlib, and it's a C library. I know that C libraries can be used from C++, but I prefer C++ libraries. Does anyone know a good one to suggest? Best regards. Most compression libraries that I know of are written in C for two reasons: one, the general age of good compression algorithms; and two, the high portability (and stability) of C across platforms. I suggest any of the following. If you want good licenses, select one of the top two; otherwise, if …
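For reference, zlib's one-shot API is perfectly usable from C++ even though the library is C; note that it produces raw deflate streams rather than .zip archives (minizip or libzip handle the archive format). A minimal sketch, with the function and buffer names being mine:

    #include <stdexcept>
    #include <string>
    #include <vector>
    #include <zlib.h>

    // Deflate-compress a string with zlib's one-shot compress() call.
    std::vector<unsigned char> zip_string(const std::string& input)
    {
        uLongf destLen = compressBound(input.size());   // worst-case output size
        std::vector<unsigned char> out(destLen);
        int rc = compress(out.data(), &destLen,
                          reinterpret_cast<const Bytef*>(input.data()),
                          input.size());
        if (rc != Z_OK)
            throw std::runtime_error("zlib compress() failed");
        out.resize(destLen);  // shrink to the actual compressed size
        return out;
    }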

UIImage to raw NSData / avoid compression

牧云@^-^@ submitted on 2019-11-30 09:05:58
Question: I have my own image downloader class; it holds a queue and downloads images one (or a certain number) at a time, writes them to the cache folder, and retrieves them from the cache folder when necessary. I also have a UIImageView subclass to which I can pass a URL; through the image downloader class it will check whether the image already exists on the device and show it if it does, or download and show it after it finishes. After an image finishes downloading I do the following. I create a UIImage …
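Judging by the title, the pitfall being avoided is UIImagePNGRepresentation / UIImageJPEGRepresentation, which re-encode (and re-compress) the bitmap. A hedged Objective-C sketch of pulling the decoded bytes straight from Core Graphics instead; the image variable is assumed, and if the originally downloaded NSData is still available, simply caching that avoids re-encoding altogether.

    // 'image' is assumed to be the freshly downloaded UIImage.
    CGImageRef cgImage = image.CGImage;
    // Copy the decoded bitmap data straight from the image's data provider,
    // with no PNG/JPEG re-encoding step.
    CFDataRef rawData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
    NSData *pixelData = (__bridge_transfer NSData *)rawData;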

PHP+Imagick - PNG Compression

心不动则不痛 submitted on 2019-11-30 09:00:29
Question: How do I efficiently compress a PNG? In my case, the images are small grayscale images with transparency. Currently I'm playing with this: // ... $im->setImageFormat('png'); $im->setImageColorspace(\Imagick::COLORSPACE_GRAY); $im->setImageCompression(\Imagick::COMPRESSION_LZW); $im->setImageCompressionQuality(9); $im->stripImage(); $im->writeImage($url_t); As Imagick doesn't offer COMPRESSION_PNG, I've tried LZW, but there's almost no change in the file size (usually it's even bigger than …
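As far as I understand ImageMagick's PNG handling, the "quality" value is split into a zlib level (tens digit) and a filter type (ones digit), so the 9 in the snippet above effectively asks for zlib level 0, which would explain the poor results. A rough sketch of settings worth trying; the values are illustrative, not a guaranteed fix:

    // Rough sketch, not a drop-in solution.
    $im->setImageFormat('png');
    // For PNG, quality = (zlib level * 10) + filter; 95 = level 9, adaptive filtering.
    $im->setImageCompressionQuality(95);
    // The same knob exposed more explicitly as a coder option.
    $im->setOption('png:compression-level', '9');
    $im->stripImage();
    $im->writeImage($url_t);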

Java: save string as gzip file

守給你的承諾、 submitted on 2019-11-30 08:51:34
I'm a Java beginner and I need something like this: String2GzipFile(String file_content, String file_name), e.g. String2GzipFile("Lorem ipsum dolor sit amet, consectetur adipiscing elit.", "lorem.txt.gz"). I can't figure out how to do that. Jon Skeet: There are two orthogonal concepts here: converting text to binary, typically through an OutputStreamWriter, and compressing the binary data, e.g. using GZIPOutputStream. So in the end you'll want to: create an OutputStream which writes to wherever you want the result (e.g. a file, or in memory via a ByteArrayOutputStream), and wrap that OutputStream in a GZIPOutputStream …
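A minimal sketch of that layering; the class and method names simply mirror the ones the question asks for.

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPOutputStream;

    public class String2GzipFile {

        // FileOutputStream -> GZIPOutputStream -> OutputStreamWriter:
        // the writer turns text into bytes, the gzip stream compresses them,
        // and the file stream writes the result to disk.
        static void string2GzipFile(String fileContent, String fileName) throws IOException {
            try (OutputStream fos = new FileOutputStream(fileName);
                 OutputStream gzos = new GZIPOutputStream(fos);
                 Writer writer = new OutputStreamWriter(gzos, StandardCharsets.UTF_8)) {
                writer.write(fileContent);
            }
        }

        public static void main(String[] args) throws IOException {
            string2GzipFile("Lorem ipsum dolor sit amet, consectetur adipiscing elit.", "lorem.txt.gz");
        }
    }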

How to execute 7zip without blocking the InnoSetup UI?

此生再无相见时 submitted on 2019-11-30 08:46:25
Question: My InnoSetup GUI freezes during unzip operations. I have a procedure DoUnzip(source: String; targetdir: String) with the core unzipTool := ExpandConstant('{tmp}\7za.exe'); Exec(unzipTool, ' x "' + source + '" -o"' + targetdir + '" -y', '', SW_HIDE, ewWaitUntilTerminated, ReturnCode); This procedure is called multiple times, and the Exec operation blocks the user interface. There is only a very short moment between the executions where the Inno GUI is draggable/movable. I know that there are …

Minify HTML/PHP

[亡魂溺海] submitted on 2019-11-30 08:34:55
Question: I'm using gzip to compress my HTML/PHP files along with JS/CSS/etc. This reduces the payload quite nicely, but I also want to 'minify' the markup of both .html and .php pages. Ideally I'd like to control this from a .htaccess file (where I also do the gzipping) rather than having to add PHP to each file. I'd like the output to be like that of http://google.com or http://www.w3-edge.com/wordpress-plugins/w3-total-cache/ and http://css-tricks.com (both produced by the W3 Total Cache plugin …
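One way to keep the control in .htaccess is to auto-prepend a small output-buffer callback instead of editing every file. A rough sketch follows; the file path is illustrative, the regexes are the usual blunt whitespace-collapsing ones (they can mangle <pre> blocks and inline scripts), and this only covers .php pages, since static .html is served without PHP.

    # .htaccess (assuming mod_php): prepend the buffer callback to every PHP page
    php_value auto_prepend_file /path/to/minify.php

    <?php
    // minify.php - collapse whitespace between tags before the output is sent
    // (and then gzipped by the existing .htaccess rules).
    function minify_html($buffer)
    {
        $search  = array('/\>[^\S ]+/s', '/[^\S ]+\</s', '/(\s)+/s');
        $replace = array('>', '<', '\1');
        return preg_replace($search, $replace, $buffer);
    }
    ob_start('minify_html');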

Dynamic Compression not working IIS 7.5

走远了吗. submitted on 2019-11-30 08:17:34
I currently have both static and dynamic compression configured. The static compression is working; however, the dynamic compression, when checked through YSlow and Fiddler, is not. In my applicationHost.config I have the following settings: <urlCompression doStaticCompression="true" doDynamicCompression="true" dynamicCompressionBeforeCache="true" /> <httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files" maxDiskSpaceUsage="100" minFileSizeForComp="256"> <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" dynamicCompressionLevel="1" /> …
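When the urlCompression flags look right but dynamic responses still come back uncompressed, the usual suspects are the Dynamic Content Compression module not being installed and the <dynamicTypes> list not matching the response's MIME type (IIS also skips dynamic compression under CPU load). A sketch of the relevant section inside <httpCompression> in applicationHost.config; the MIME types shown are examples, not the poster's actual configuration:

    <dynamicTypes>
      <add mimeType="text/*" enabled="true" />
      <add mimeType="application/javascript" enabled="true" />
      <add mimeType="application/json" enabled="true" />
      <add mimeType="*/*" enabled="false" />
    </dynamicTypes>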

How to read a .gz file line-by-line in C++?

谁说胖子不能爱 submitted on 2019-11-30 08:12:59
I have a 3-terabyte .gz file and want to read its uncompressed content line by line in a C++ program. As the file is quite huge, I want to avoid loading it completely into memory. Can anyone post a simple example of doing it? You most probably will have to use zlib's inflate; an example is available from their site. Alternatively, you may have a look at the Boost C++ wrapper. The example from the Boost page (it decompresses data from a file and writes it to standard output): #include <fstream> #include <iostream> #include <boost/iostreams/filtering_streambuf.hpp> #include <boost/iostreams/copy.hpp> #include <boost …
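The Boost example referenced above copies the whole decompressed stream to stdout; for line-by-line reading, a boost::iostreams::filtering_istream can be fed to std::getline, which only keeps a small decompression window in memory. A sketch along those lines, with the file name assumed:

    #include <fstream>
    #include <iostream>
    #include <string>
    #include <boost/iostreams/filtering_stream.hpp>
    #include <boost/iostreams/filter/gzip.hpp>

    int main()
    {
        // Stream the gzip file through a decompressing filter; the file is
        // never fully loaded, so a multi-terabyte archive is fine.
        std::ifstream file("huge.gz", std::ios_base::in | std::ios_base::binary);
        boost::iostreams::filtering_istream in;
        in.push(boost::iostreams::gzip_decompressor());
        in.push(file);

        std::string line;
        while (std::getline(in, line)) {
            // process one uncompressed line at a time
        }
    }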

Why can't hadoop split up a large text file and then compress the splits using gzip?

馋奶兔 submitted on 2019-11-30 07:30:40
Question: I've recently been looking into Hadoop and HDFS. When you load a file into HDFS, it will normally split the file into 64 MB chunks and distribute those chunks around your cluster. Except it can't do this with gzipped files, because a gzipped file can't be split. I completely understand why this is the case (I don't need anyone to explain why a gzipped file can't be split up). But why couldn't HDFS take a plain text file as input and split it like normal, then compress each split using gzip …