GZipStream

GZipStream does not call the underlying stream's *Async methods when it is the destination of CopyToAsync

Submitted by 大憨熊 on 2019-12-12 05:38:39
Question: Using the following construct, GZipStream never seems to call the *Async methods of my custom stream when the GZipStream is the destination of CopyToAsync: using (var fs = new System.IO.FileStream(@"C:\BTR\Source\Assemblies\BTR.Rbl.Evolution.Documents.dll", System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.None, 8192, true)) { using (var ss = new GZipStream(new MyCustomStream(), CompressionMode.Compress)) { await fs.CopyToAsync(ss); } } It seems to only call the …
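A common workaround (on runtimes where GZipStream's WriteAsync falls back to the synchronous path) is to finish the compression into a MemoryStream first, and only then CopyToAsync the compressed bytes into the custom stream, so the destination's WriteAsync is invoked directly. A minimal sketch, using a MemoryStream as a stand-in for MyCustomStream and in-memory sample data instead of the file:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Sample payload standing in for the file contents.
byte[] original = new byte[100_000];
new Random(1).NextBytes(original);

// Compress fully into memory first...
var compressed = new MemoryStream();
using (var gz = new GZipStream(compressed, CompressionMode.Compress, leaveOpen: true))
    await gz.WriteAsync(original, 0, original.Length);

// ...then copy to the real destination with CopyToAsync, which now calls
// the destination's WriteAsync directly (no GZipStream in between).
var destination = new MemoryStream(); // stand-in for MyCustomStream
compressed.Position = 0;
await compressed.CopyToAsync(destination);

// Round-trip check.
destination.Position = 0;
var roundTrip = new MemoryStream();
using (var unzip = new GZipStream(destination, CompressionMode.Decompress))
    unzip.CopyTo(roundTrip);
Console.WriteLine(roundTrip.Length); // 100000
```

The trade-off is buffering the whole compressed payload in memory, which is usually acceptable since compressed output is small.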

Compressing a PDF file makes it bigger

Submitted by 我的梦境 on 2019-12-12 03:55:56
Question: I compress uploaded .pdf files and save them to the server's file system. Everything works, but the file gets bigger, from 30 KB to 48 KB. What could I be doing wrong? Here's the part of the code where I compress the uploaded file: FileStream sourceFile = System.IO.File.OpenRead(filePath); FileStream destFile = System.IO.File.Create(zipPath); GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress); try { int theByte = sourceFile.ReadByte(); while (theByte != -1) { compStream …
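Two separate things are worth noting here. First, a PDF's content streams are usually already deflate-compressed internally, so gzipping the whole file often makes it larger. Second, the GZipStream must be disposed before the output file is usable, or the final block and gzip trailer are never written. A sketch of the copy with deterministic disposal, using temp-file paths and a zero-filled byte array as a stand-in for the upload:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Hypothetical paths; any readable source file works.
string filePath = Path.GetTempFileName();
string zipPath = filePath + ".gz";
File.WriteAllBytes(filePath, new byte[50_000]); // stand-in for the uploaded PDF

using (FileStream sourceFile = File.OpenRead(filePath))
using (FileStream destFile = File.Create(zipPath))
using (var compStream = new GZipStream(destFile, CompressionMode.Compress))
{
    sourceFile.CopyTo(compStream); // buffered copy instead of a ReadByte loop
} // disposing the GZipStream flushes the final block and the gzip trailer

Console.WriteLine(new FileInfo(zipPath).Length < 50_000); // True: zeros compress well
```

With a real PDF as input, expect the .gz file to be roughly the same size or slightly larger; that is the format's overhead on already-compressed data, not a bug in the copy.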

Compressing and decompressing a string yields only the first letter of the original string?

Submitted by 情到浓时终转凉″ on 2019-12-12 02:16:24
Question: I'm compressing a string with Gzip using this code: public static String Compress(String decompressed) { byte[] data = Encoding.Unicode.GetBytes(decompressed); using (var input = new MemoryStream(data)) using (var output = new MemoryStream()) { using (var gzip = new GZipStream(output, CompressionMode.Compress, true)) { input.CopyTo(gzip); } return Convert.ToBase64String(output.ToArray()); } } and decompressing it with this code: public static String Decompress(String compressed) { byte[] data …
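The usual cause of the "first letter only" symptom is a decompression loop that calls Read once and treats its (possibly small) return value as the whole payload; with Encoding.Unicode, the first couple of bytes decode to exactly one character. Letting CopyTo drain the stream to end-of-file avoids that. A symmetric pair, keeping the question's Encoding.Unicode on both sides:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

static string Compress(string decompressed)
{
    byte[] data = Encoding.Unicode.GetBytes(decompressed);
    using var output = new MemoryStream();
    using (var gzip = new GZipStream(output, CompressionMode.Compress, leaveOpen: true))
        gzip.Write(data, 0, data.Length); // disposing flushes the gzip trailer
    return Convert.ToBase64String(output.ToArray());
}

static string Decompress(string compressed)
{
    byte[] data = Convert.FromBase64String(compressed);
    using var input = new MemoryStream(data);
    using var gzip = new GZipStream(input, CompressionMode.Decompress);
    using var output = new MemoryStream();
    gzip.CopyTo(output); // loops until EOF; a single Read may return only a few bytes
    return Encoding.Unicode.GetString(output.ToArray());
}

Console.WriteLine(Decompress(Compress("hello world"))); // hello world
```

The key detail is on the decompression side: never assume one Read call returns everything the stream has.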

Concatenate gzipped byte arrays in C#

Submitted by 情到浓时终转凉″ on 2019-12-11 13:56:23
Question: I have gzipped data stored in a DB. Is there a way to concatenate, say, 50 separate pieces of gzipped data into one gzipped output that can be uncompressed? The result should be the same as decompressing the 50 items, concatenating them, and then gzipping them. I would like to avoid the decompression phase. Is there also some performance benefit to merging already-gzipped data instead of gzipping the whole byte array? Answer 1: Yes, you can concatenate gzip streams, which when decompressed give you the same thing as …
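Per RFC 1952, gzip files may simply be stored back to back: each compressed blob is a complete "member", and a decoder that supports multi-member input returns the concatenation of the originals. One caveat: older .NET Framework versions of GZipStream stop after the first member, so the sketch below assumes a runtime (modern .NET) that reads them all:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Text;

static byte[] Gzip(byte[] data)
{
    using var ms = new MemoryStream();
    using (var gz = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
        gz.Write(data, 0, data.Length);
    return ms.ToArray();
}

byte[] a = Gzip(Encoding.UTF8.GetBytes("hello "));
byte[] b = Gzip(Encoding.UTF8.GetBytes("world"));

// RFC 1952 allows complete gzip members to be stored back to back.
byte[] joined = a.Concat(b).ToArray();

using var unzip = new GZipStream(new MemoryStream(joined), CompressionMode.Decompress);
using var output = new MemoryStream();
unzip.CopyTo(output);
Console.WriteLine(Encoding.UTF8.GetString(output.ToArray()));
```

On the performance question: concatenation is just a byte copy, so it is far cheaper than decompressing and recompressing, at the cost of repeating the ~18 bytes of per-member framing and losing cross-member compression context.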

Can GZip compression (via .net) increase file size?

Submitted by 隐身守侯 on 2019-12-10 19:28:46
Question: I keep track of the original size of the files I'm compressing using .NET's GZipStream class, and it seems like a file that I thought I was compressing has increased in size. Is that possible? This is how I'm doing the compression: Byte[] bytes = GetFileBytes(file); using (FileStream fileStream = new FileStream("Zipped.gz", FileMode.Create)) { using (GZipStream zipStream = new GZipStream(fileStream, CompressionMode.Compress)) { zipStream.Write(bytes, 0, bytes.Length); } } Answer 1: Yes, it …
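It is indeed possible: gzip adds a 10-byte header, an 8-byte trailer, and a few bytes of deflate block overhead, so input that is already compressed (or random) cannot shrink and typically grows slightly. A quick demonstration with incompressible random bytes:

```csharp
using System;
using System.IO;
using System.IO.Compression;

byte[] random = new byte[10_000];
new Random(42).NextBytes(random); // effectively incompressible input

using var ms = new MemoryStream();
using (var zip = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
    zip.Write(random, 0, random.Length);

// gzip adds a 10-byte header, an 8-byte trailer, and roughly 5 bytes
// per stored deflate block, so incompressible data ends up slightly larger.
Console.WriteLine(ms.Length > random.Length); // True
```

Typical candidates for this effect are files that are already compressed internally: .zip, .jpg, .png, .docx, and so on.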

Is there a problem with IO.Compression?

Submitted by 放肆的年华 on 2019-12-10 18:33:54
Question: I've just started compressing files in VB.Net, using the following code. Since I'm targeting Fx 2.0, I can't use the Stream.CopyTo method. My code, however, gives extremely poor results compared to the gzip Normal compression profile in 7-Zip. For example, my code compressed a 630 MB Outlook archive to 740 MB, while 7-Zip makes it 490 MB. Here is the code. Is there a blatant mistake (or many)? Using Input As New IO.FileStream(SourceFile, IO.FileMode.Open, IO.FileAccess.Read, IO.FileShare.Read) …
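On .NET 2.0 the usual replacement for Stream.CopyTo is a manual buffered loop; note, though, that a weak ratio like the one reported is largely a known limitation of the pre-4.5 deflate implementation (which was replaced with zlib in .NET 4.5), not necessarily the loop. A C# sketch of the buffered copy (the VB translation is mechanical), using an in-memory sample instead of the question's files:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// .NET 2.0-style buffered copy; Stream.CopyTo only arrived in .NET 4.0.
static void CopyStream(Stream input, Stream output)
{
    byte[] buffer = new byte[32768];
    int read;
    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        output.Write(buffer, 0, read);
}

byte[] data = new byte[100_000]; // highly compressible sample payload
using var src = new MemoryStream(data);
using var dst = new MemoryStream();
using (var gz = new GZipStream(dst, CompressionMode.Compress, leaveOpen: true))
    CopyStream(src, gz);
Console.WriteLine(dst.Length < data.Length); // True
```

If the 7-Zip-level ratio matters on Fx 2.0, a third-party deflate implementation (e.g. a managed zlib port) is usually the practical fix rather than tuning the copy loop.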

GZipStream decompression performance is poor

Submitted by 拥有回忆 on 2019-12-09 03:19:52
Question: I have a .NET 2.0 WinForms app that connects to a backend WAS server. I am using GZipStream to decode data coming back from an HttpWebRequest call made to the server. The returned data is compressed CSV, which Apache is compressing. The entire server stack is Hibernate-->EJB-->Spring-->Apache. For small responses, the performance is fine (<50 ms). For a response >150 KB, it takes more than 60 seconds to decompress. The majority of the time seems to be spent in the GZipStream constructor.
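One low-risk mitigation is to make sure GZipStream reads from a buffered source, so decompression is not issuing thousands of tiny reads against the network stream. A sketch, with a MemoryStream standing in for the HTTP response stream and the 64 KB buffer size chosen arbitrarily:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Wrap the (possibly slow, unbuffered) response stream in a BufferedStream
// before handing it to GZipStream, so decompression reads large chunks.
static byte[] DecompressResponse(Stream responseStream)
{
    using var buffered = new BufferedStream(responseStream, 65536);
    using var gzip = new GZipStream(buffered, CompressionMode.Decompress);
    using var result = new MemoryStream();
    gzip.CopyTo(result);
    return result.ToArray();
}

// Demo: build a gzip payload in memory, then decompress it through the helper.
byte[] payload = new byte[200_000];
var ms = new MemoryStream();
using (var gz = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
    gz.Write(payload, 0, payload.Length);
ms.Position = 0;

byte[] result = DecompressResponse(ms);
Console.WriteLine(result.Length); // 200000
```

If the time really is spent in the constructor rather than in Read, it is also worth checking that the response stream is fully available (not blocking on the server) before decompression starts.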

GZipStream not reading the whole file

Submitted by 跟風遠走 on 2019-12-08 19:14:13
Question: I have some code that downloads gzipped files and decompresses them. The problem is, I can't get it to decompress the whole file; it only reads the first 4096 bytes and then about 500 more. Byte[] buffer = new Byte[4096]; int count = 0; FileStream fileInput = new FileStream("input.gzip", FileMode.Open, FileAccess.Read, FileShare.Read); FileStream fileOutput = new FileStream("output.dat", FileMode.Create, FileAccess.Write, FileShare.None); GZipStream gzipStream = new GZipStream(fileInput, …
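This symptom usually means the read loop stops after one pass, or reads from the file stream instead of the GZipStream. Read must be called until it returns 0, because a single call may return fewer bytes than requested even when more data remains. A self-contained sketch that first writes a sample gzip file to a temp path, then decompresses it with the corrected loop:

```csharp
using System;
using System.IO;
using System.IO.Compression;

string inPath = Path.GetTempFileName();
string outPath = Path.GetTempFileName();

// Create a gzip file whose payload is larger than one 4096-byte read.
byte[] payload = new byte[100_000];
using (var f = File.Create(inPath))
using (var gz = new GZipStream(f, CompressionMode.Compress))
    gz.Write(payload, 0, payload.Length);

Byte[] buffer = new Byte[4096];
using (var fileInput = File.OpenRead(inPath))
using (var fileOutput = File.Create(outPath))
using (var gzipStream = new GZipStream(fileInput, CompressionMode.Decompress))
{
    int count;
    // Keep calling Read until it returns 0 (end of stream).
    while ((count = gzipStream.Read(buffer, 0, buffer.Length)) > 0)
        fileOutput.Write(buffer, 0, count);
}

Console.WriteLine(new FileInfo(outPath).Length); // 100000
```

The "4096 bytes then about 500 more" pattern is exactly what two Read calls and a premature exit would produce.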

GZipStream: why do we convert to base 64 after compression?

Submitted by £可爱£侵袭症+ on 2019-12-07 15:55:23
Question: I was just looking at a code sample for compressing a string. I find that using the GZipStream class suffices. But I don't understand why we have to convert the result to a base-64 string, as shown in the example. using System.IO.Compression; using System.Text; using System.IO; public static string Compress(string text) { byte[] buffer = Encoding.UTF8.GetBytes(text); MemoryStream ms = new MemoryStream(); using (GZipStream zip = new GZipStream(ms, CompressionMode.Compress, true)) { zip.Write(buffer, 0, …
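The Base64 step has nothing to do with compression itself: GZipStream produces arbitrary binary bytes, and Convert.ToBase64String maps them onto text-safe characters (at roughly a 4/3 size cost) so the result can live in a string, JSON, XML, or a varchar column. A small demonstration:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

byte[] buffer = Encoding.UTF8.GetBytes("some text to compress");
using var ms = new MemoryStream();
using (var zip = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
    zip.Write(buffer, 0, buffer.Length);

byte[] raw = ms.ToArray();                 // arbitrary bytes: NULs, control chars...
string text = Convert.ToBase64String(raw); // text-safe, ~4/3 the size of raw

// Base64 is only needed when the bytes must travel through a text channel;
// a byte[] column or a binary file can store `raw` directly.
byte[] back = Convert.FromBase64String(text);
Console.WriteLine(back.Length == raw.Length); // True
```

So if the compressed data goes straight into a binary sink, skipping the Base64 conversion both simplifies the code and avoids the 33% size inflation.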

RESTEasy: enable GZIP globally

Submitted by 為{幸葍}努か on 2019-12-07 05:31:18
Question: I have a RestEasy + Java EE application. When I add @GZIP to a component class, the server's response is gzipped if the client sends "Accept-Encoding: gzip". Is there a way to enable gzip for all components globally? I'd rather not add the annotation to every class. I'm using RestEasy JAX-RS 3.0.1. Answer 1: No, there is no way with annotations to enable gzip for all resources. If you want to forego adding the annotation to every class, you could create a servlet filter that looks at the incoming headers and …