large-files

HttpClient throws an OutOfMemoryException when TransferEncodingChunked is not set

怎甘沉沦 submitted on 2020-01-02 07:57:14
Question: In order to support uploading large (actually very large, up to several gigabytes) files with progress reporting, we started using HttpClient with PushStreamContent, as described here. It works in a straightforward way: we copy bytes between two streams. Here is a code example:

private void PushContent(Stream src, Stream dest, int length)
{
    const int bufferLength = 1024*1024*10;
    var buffer = new byte[bufferLength];
    var pos = 0;
    while (pos < length)
    {
        var bytes = Math.Min(bufferLength, length - pos);
        src
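The loop above (read a bounded chunk, write it, advance) is the standard way to stream a large upload without buffering it whole. A minimal Python sketch of the same idea; the `on_progress` callback is a hypothetical addition to show where progress reporting would hook in:

```python
import io

def push_content(src, dest, length, chunk_size=10 * 1024 * 1024, on_progress=None):
    """Copy `length` bytes from src to dest in bounded chunks.

    Only one chunk is held in memory at a time, so memory use is
    independent of the total transfer size.
    """
    pos = 0
    while pos < length:
        chunk = src.read(min(chunk_size, length - pos))
        if not chunk:  # source ended early
            break
        dest.write(chunk)
        pos += len(chunk)
        if on_progress:
            on_progress(pos, length)
    return pos

# Usage: copy 25 bytes between in-memory streams, 10 bytes at a time
src = io.BytesIO(b"x" * 25)
dest = io.BytesIO()
copied = push_content(src, dest, 25, chunk_size=10)
```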

How to copy a large file in Windows XP?

我只是一个虾纸丫 submitted on 2020-01-02 07:35:09
Question: I have a large file in Windows XP: it's 38 GB (a VM image) and I cannot seem to copy it.

- Dragging it on the desktop gives the error "Insufficient system resources exist to complete the requested service".
- Using Java, FileChannel.transferTo(0, fileSize, dest) fails for all files > 2 GB.
- Using Java, FileChannel.transferTo() in chunks of 100 MB fails after ~18 GB with:

java.io.IOException: Insufficient system resources exist to complete the requested service
    at sun.nio.ch.FileDispatcher.write0(Native Method)
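A common workaround when single huge I/O requests fail is to keep every individual read and write small and bounded, regardless of total file size. A Python sketch of that chunked-copy approach (paths and chunk size are illustrative):

```python
import os
import tempfile

def copy_in_chunks(src_path, dest_path, chunk_size=100 * 1024 * 1024):
    """Copy a file in fixed-size chunks so no single I/O request
    is larger than chunk_size, whatever the total file size."""
    with open(src_path, "rb") as src, open(dest_path, "wb") as dest:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dest.write(chunk)

# Usage with a small temporary file and a deliberately tiny chunk size
src = tempfile.NamedTemporaryFile(delete=False)
src.write(b"abc" * 1000)
src.close()
dst = src.name + ".copy"
copy_in_chunks(src.name, dst, chunk_size=1024)
```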

How to quickly zip large files in PHP

空扰寡人 submitted on 2020-01-01 22:00:28
Question: I wrote a PHP script to dynamically pack files selected by the client into a zip file and force a download. It works well except that when the number of files is huge (over 50,000, say), it takes a very long time for the download dialog box to appear on the client side. I thought about improving this with a cache (these files are not changed very often), but because the selection of files is decided entirely by the user, and there are tens of thousands of possible combinations, it is
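When archives are built on the fly from many files, compression time usually dominates; storing entries uncompressed is one common mitigation. A Python sketch of the idea (the PHP equivalent would be the corresponding ZipArchive options; file names here are illustrative):

```python
import os
import tempfile
import zipfile

def zip_files(paths, zip_path):
    """Pack files into a zip without compression (ZIP_STORED).

    Storing is far faster than deflating when there are tens of
    thousands of files, at the cost of a larger archive.
    """
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_STORED) as zf:
        for p in paths:
            zf.write(p, arcname=os.path.basename(p))

# Usage with a couple of temporary files
tmpdir = tempfile.mkdtemp()
paths = []
for name in ("a.txt", "b.txt"):
    p = os.path.join(tmpdir, name)
    with open(p, "w") as f:
        f.write("hello")
    paths.append(p)
zip_out = os.path.join(tmpdir, "out.zip")
zip_files(paths, zip_out)
names = zipfile.ZipFile(zip_out).namelist()
```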

How to parse files larger than 100GB in Python?

ぃ、小莉子 submitted on 2020-01-01 18:24:45
Question: I have text files of about 100 GB with the format below (with duplicate records of lines, IPs and domains):

domain|ip
yahoo.com|89.45.3.5
bbc.com|45.67.33.2
yahoo.com|89.45.3.5
myname.com|45.67.33.2
etc.

I am trying to parse them using the following Python code, but I still get a MemoryError. Does anybody know a more optimal way of parsing such files? (Time is an important factor for me.)

files = glob(path)
for filename in files:
    print(filename)
    with open(filename) as f:
        for line in f
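Iterating over the file line by line (as the snippet already does) keeps memory bounded by the number of *unique* records rather than the file size. A minimal sketch of that deduplication, using the sample records above; if even the unique set does not fit in RAM, an external sort (e.g. `sort -u`) or on-disk hashing would be needed instead:

```python
def unique_records(lines):
    """Yield each domain|ip record once, reading line by line.

    Memory use grows with the number of distinct records only,
    not with the total input size.
    """
    seen = set()
    for line in lines:
        rec = line.strip()
        if rec and rec not in seen:
            seen.add(rec)
            yield rec

# Usage with the sample records from the question
sample = ["yahoo.com|89.45.3.5\n", "bbc.com|45.67.33.2\n",
          "yahoo.com|89.45.3.5\n", "myname.com|45.67.33.2\n"]
result = list(unique_records(sample))
```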

Large file support not working in C programming

妖精的绣舞 submitted on 2019-12-30 11:30:41
Question: I'm trying to compile a shared object (that is eventually used in Python with ctypes). The command line used to build the object is:

gcc -Wall -O3 -shared -Wl,-soname,borg_stream -lm -m128bit-long-double -fPIC \
    -D_FILE_OFFSET_BITS=64 -o borg_stream.so data_stream.c data_types.c \
    file_operations.c float_half.c channels.c statistics.c index_stream.c helpers.c

The library builds properly on a 32-bit OS and does what it needs to for small files. However, it fails the unit tests for files

Handling very large images in Qt

送分小仙女□ submitted on 2019-12-30 10:41:19
Question: I can't get Qt to work on images beyond 10,000 x 10,000. I'm dealing with huge satellite images that are around 2 GB each. I considered using memory mapping, but the image still occupies space in memory.

QFile file("c://qt//a.ras");
file.open(QIODevice::ReadOnly);
qint64 size = file.size();
uchar *img = file.map(0, size);
QImage I(img, w, h, QImage::Format_ARGB32);

Can anyone tell me a more efficient way to deal with large images in Qt?

Answer 1: Use a QGraphicsView and a set of image tiles; the view handles all
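The tiling approach the answer suggests boils down to: split the huge image into fixed-size tiles and decode only the tiles intersecting the visible viewport. A language-neutral sketch in Python of just that geometry (tile size and function names are illustrative, not Qt API):

```python
def tile_rects(width, height, tile=256):
    """Return (x, y, w, h) rectangles covering a width x height image
    with tiles of at most tile x tile pixels."""
    rects = []
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            rects.append((x, y, min(tile, width - x), min(tile, height - y)))
    return rects

def visible_tiles(rects, vx, vy, vw, vh):
    """Tiles intersecting the viewport (vx, vy, vw, vh); only these
    need to be decoded and handed to the view."""
    return [r for r in rects
            if r[0] < vx + vw and vx < r[0] + r[2]
            and r[1] < vy + vh and vy < r[1] + r[3]]

# Usage: a 1000 x 1000 image in 256-pixel tiles, 300 x 300 viewport
rects = tile_rects(1000, 1000, tile=256)
vis = visible_tiles(rects, 0, 0, 300, 300)
```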

On Windows, _fseeki64 does not seek to SEEK_END correctly for large files

╄→尐↘猪︶ㄣ submitted on 2019-12-30 07:35:51
Question: I have reduced the problem to the following basic function, which should simply print the number of bytes in the file. When I execute it for a file of 83886080 bytes (80 MB), it prints the correct number. However, for a file of 4815060992 bytes (4.48 GB), it prints 520093696, which is far too low. It seems to have something to do with the SEEK_END option, because if I set the pointer to 4815060992 bytes manually (e.g. _fseeki64(fp, (__int64)4815060992, SEEK_SET)), _ftelli64 does return the correct
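Note that 520093696 is exactly 4815060992 − 2^32, so the offset is being truncated to 32 bits somewhere in the SEEK_END path. A quick way to reproduce the scenario without a real 4.5 GB file is a sparse file of the same size; the Python sketch below creates one and checks that seek-to-end/tell round-trips the full 64-bit size (Python file offsets are 64-bit on mainstream platforms; this does not diagnose the CRT issue itself):

```python
import os
import tempfile

SIZE = 4815060992  # 4.48 GB, as in the question

# Create a sparse file of SIZE bytes: seek past the end and write one byte.
# Almost no disk space is actually allocated on filesystems with sparse-file
# support.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.seek(SIZE - 1)
tmp.write(b"\0")
tmp.close()

with open(tmp.name, "rb") as f:  # binary mode matters for seeking on Windows
    f.seek(0, os.SEEK_END)
    end = f.tell()

os.unlink(tmp.name)
```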

Getting an exception when trying to upload large files

五迷三道 submitted on 2019-12-30 05:32:06
Question: I'm using wsHttpBinding for my service:

<wsHttpBinding>
  <binding name="wsHttpBinding_Windows" maxBufferPoolSize="9223372036854775807"
           maxReceivedMessageSize="2147483647">
    <readerQuotas maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                  maxStringContentLength="2147483647" maxNameTableCharCount="2147483647"/>
    <security mode="Message">
      <message clientCredentialType="Windows"/>
    </security>
  </binding>
</wsHttpBinding>

<behavior name="ServiceBehavior">
  <dataContractSerializer
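Raising the quotas only goes so far, because wsHttpBinding with Message security buffers each message in full. For multi-gigabyte uploads, WCF's streamed transfer mode is the usual approach, and it requires a binding that supports transferMode, such as basicHttpBinding or netTcpBinding. A hedged sketch of what such a binding might look like (the binding name is illustrative, and the security settings would need rethinking, since Message security is incompatible with streaming):

```xml
<basicHttpBinding>
  <binding name="streamedUpload"
           transferMode="Streamed"
           maxReceivedMessageSize="4294967296"
           sendTimeout="00:10:00" receiveTimeout="00:10:00">
    <security mode="TransportCredentialOnly">
      <transport clientCredentialType="Windows"/>
    </security>
  </binding>
</basicHttpBinding>
```

With streaming, the service contract would also typically take a Stream parameter rather than a buffered byte array, so the body is read incrementally instead of being deserialized in one piece.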