large-files

AS3 Working With Arbitrarily Large Files

吃可爱长大的小学妹 submitted on 2019-12-05 02:24:11
Question: I am trying to read a very large file in AS3 and am having problems with the runtime crashing on me. I'm currently using a FileStream to open the file asynchronously. This does not work (it crashes without an Exception) for files bigger than about 300 MB.

_fileStream = new FileStream();
_fileStream.addEventListener(IOErrorEvent.IO_ERROR, loadError);
_fileStream.addEventListener(Event.COMPLETE, loadComplete);
_fileStream.openAsync(myFile, FileMode.READ);

In looking at the documentation, it
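
The excerpt cuts off here, but the usual remedy is language-independent: read the file in fixed-size chunks so memory use stays bounded, rather than letting the runtime buffer the whole thing at once. A minimal sketch of that chunked-read loop, written in Java for illustration (the path is a placeholder, not from the question):

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ChunkedRead {
    public static void main(String[] args) throws IOException {
        byte[] buffer = new byte[1 << 20]; // 1 MB chunk: memory use stays constant
        long total = 0;
        // "huge.bin" is a placeholder path, not from the original question
        try (InputStream in = Files.newInputStream(Paths.get("huge.bin"))) {
            int n;
            while ((n = in.read(buffer)) != -1) {
                total += n; // process the chunk here instead of accumulating it
            }
        }
        System.out.println("Read " + total + " bytes");
    }
}

In AS3 the analogous move is to process data as PROGRESS events arrive instead of waiting for COMPLETE.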

Good and effective CSV/TSV Reader for Java

旧巷老猫 submitted on 2019-12-05 01:36:11
I am trying to read big CSV and TSV (tab-separated) files with about 1,000,000 rows or more. Now I tried to read a TSV containing ~2,500,000 lines with opencsv, but it throws a java.lang.NullPointerException. It works with smaller TSV files with ~250,000 lines. So I was wondering if there are any other libraries that support reading huge CSV and TSV files. Do you have any ideas? For everybody who is interested in my code (I shortened it, so the try-catch is obviously incomplete):

InputStreamReader in = null;
CSVReader reader = null;
try {
    in = this.replaceBackSlashes();
    reader = new CSVReader(in,
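
One streaming pattern worth noting: opencsv can consume records one at a time with readNext() instead of materializing every row. Whether that cures this particular NullPointerException depends on the data, but the constant-memory loop looks roughly like this (a sketch assuming opencsv 4+ and a placeholder file name):

import com.opencsv.CSVParserBuilder;
import com.opencsv.CSVReader;
import com.opencsv.CSVReaderBuilder;
import java.io.FileReader;

public class StreamTsv {
    public static void main(String[] args) throws Exception {
        // "data.tsv" is a placeholder; the point is readNext() streams one record at a time
        try (CSVReader reader = new CSVReaderBuilder(new FileReader("data.tsv"))
                .withCSVParser(new CSVParserBuilder().withSeparator('\t').build())
                .build()) {
            String[] row;
            long count = 0;
            while ((row = reader.readNext()) != null) {
                count++; // process each row here; nothing is accumulated in memory
            }
            System.out.println(count + " rows");
        }
    }
}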

How to quickly zip large files in PHP

妖精的绣舞 submitted on 2019-12-04 21:06:07
I wrote a PHP script to dynamically pack files selected by the client into a zip file and force a download. It works well except that when the number of files is huge (like over 50,000), it takes a very long time for the download dialog box to appear on the client side. I thought about improving this with a cache (these files are not changed very often), but because the selection of files is decided entirely by the user, and there are tens of thousands of possible combinations, it is very hard to cache the combinations. I also thought about generating zip archives for individual files
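
Independent of caching, the common fix for the slow dialog is to stream the archive: write zip entries straight to the response stream so the download begins with the first entry instead of after the last one is compressed. The idea, sketched in Java with ZipOutputStream and placeholder file names (the original question is PHP, where streaming zip libraries play the same role):

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class StreamingZip {
    // Write the selected files into 'out' (e.g. an HTTP response stream)
    // entry by entry, so the client sees data immediately.
    static void zipTo(OutputStream out, List<Path> files) throws IOException {
        try (ZipOutputStream zip = new ZipOutputStream(out)) {
            for (Path file : files) {
                zip.putNextEntry(new ZipEntry(file.getFileName().toString()));
                Files.copy(file, zip); // streams the file without loading it whole
                zip.closeEntry();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // placeholder usage: zip two local files to stdout
        zipTo(System.out, List.of(Paths.get("a.txt"), Paths.get("b.txt")));
    }
}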

Efficient way to aggregate and remove duplicates from very large (password) lists

 ̄綄美尐妖づ submitted on 2019-12-04 20:56:41
Context: I am attempting to combine a large number of separate password-list text files into a single file for use in dictionary-based password cracking. Each text file is line-delimited (a single password per line) and there are 82 separate files at the moment. Most (66) files are in the 1-100 MB file-size range, 12 are 100-700 MB, 3 are 2 GB, and 1 (the most problematic) is 11.2 GB. In total I estimate 1.75 billion non-unique passwords need processing; of these I estimate ~450 million (~25%) will be duplicates and will ultimately need to be discarded. I am attempting to do this on a device which has a
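
With more data than RAM, a standard approach is hash partitioning: scatter every password into one of N bucket files by hash, so all copies of a given password land in the same bucket, then dedup each bucket independently in memory. A rough Java sketch of the two passes, with placeholder file names, a bucket count that would need tuning, and the assumption that the lists are valid UTF-8 text:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashSet;
import java.util.Set;

public class DedupByPartition {
    public static void main(String[] args) throws IOException {
        int buckets = 64; // tune so each bucket's unique lines fit in RAM
        BufferedWriter[] out = new BufferedWriter[buckets];
        for (int i = 0; i < buckets; i++) {
            out[i] = Files.newBufferedWriter(Paths.get("bucket-" + i + ".txt"));
        }
        // Pass 1: scatter every password into a bucket chosen by its hash.
        try (BufferedReader in = Files.newBufferedReader(Paths.get("all-passwords.txt"))) {
            String line;
            while ((line = in.readLine()) != null) {
                int b = Math.floorMod(line.hashCode(), buckets);
                out[b].write(line);
                out[b].newLine();
            }
        }
        for (BufferedWriter w : out) w.close();
        // Pass 2: dedup each bucket independently with an in-memory set.
        try (BufferedWriter merged = Files.newBufferedWriter(Paths.get("unique.txt"))) {
            for (int i = 0; i < buckets; i++) {
                Set<String> seen = new HashSet<>();
                try (BufferedReader in = Files.newBufferedReader(Paths.get("bucket-" + i + ".txt"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        if (seen.add(line)) { merged.write(line); merged.newLine(); }
                    }
                }
            }
        }
    }
}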

Dealing with large files in Haskell

五迷三道 submitted on 2019-12-04 19:31:37
Question: I have a large file (4+ GB) of, let's just say, 4-byte floats. I would like to treat it as a List, in the sense that I would like to be able to use map, filter, foldl, etc. However, instead of producing a new list with the output, I would like to write the output back into the file, and thus only have to load a small portion of the file into memory. You could say I want a type called MutableFileList. Has anyone run into this situation before? Instead of re-inventing the wheel I was wondering if
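
The streaming idiom the question is reaching for keeps only a constant window in memory: read one element, apply the function, write the result out. For illustration, here is that constant-memory map over a file of big-endian 4-byte floats, sketched in Java with placeholder file names (in Haskell itself, a streaming library such as conduit or pipes is the usual vehicle):

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class StreamFloatMap {
    public static void main(String[] args) throws IOException {
        // Constant-memory "map" over a file of big-endian 4-byte floats:
        // read one float, transform it, write it to the output file.
        try (DataInputStream in = new DataInputStream(
                 new BufferedInputStream(new FileInputStream("floats.in")));
             DataOutputStream out = new DataOutputStream(
                 new BufferedOutputStream(new FileOutputStream("floats.out")))) {
            while (true) {
                float f;
                try { f = in.readFloat(); }
                catch (EOFException eof) { break; }
                out.writeFloat(f * 2.0f); // the "map" step: any pure function works here
            }
        }
    }
}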

Large file transfer over Java socket [duplicate]

喜你入骨 submitted on 2019-12-04 14:57:23
This question already has answers here: Java multiple file transfer over socket (2 answers). Closed last year. I have written a small client-server program for transferring small files. It uses DataOutputStream and the readFully() method of DataInputStream. This code does not work for larger files, for obvious reasons. I was thinking of fragmenting large files into smaller chunks of 1 KB each before sending them to the client. But I can't think of a solution for how to write multiple chunks to the data output stream with the correct offset, and how to reassemble them at the receiving end. Can anyone provide a
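
There is no offset bookkeeping to do: TCP already delivers bytes in order, so the standard pattern is to send the file length first and then loop over fixed-size chunks, while the receiver reads until it has consumed exactly that many bytes. A sketch of the two loops, to be wired to a Socket's input and output streams (names are placeholders):

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedTransfer {
    // Sender: prefix with the length, then stream fixed-size chunks in order.
    static void send(File file, DataOutputStream out) throws IOException {
        out.writeLong(file.length());
        byte[] buf = new byte[8192];
        try (InputStream in = new FileInputStream(file)) {
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n); // TCP preserves order, so no offsets are needed
            }
        }
        out.flush();
    }

    // Receiver: read exactly 'length' bytes, chunk by chunk, appending in order.
    static void receive(DataInputStream in, File target) throws IOException {
        long remaining = in.readLong();
        byte[] buf = new byte[8192];
        try (OutputStream out = new FileOutputStream(target)) {
            while (remaining > 0) {
                int n = in.read(buf, 0, (int) Math.min(buf.length, remaining));
                if (n == -1) throw new EOFException("connection closed early");
                out.write(buf, 0, n);
                remaining -= n;
            }
        }
    }
}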

Zooming and loading very large TIFF file

依然范特西╮ submitted on 2019-12-04 13:33:55
Question: I have a very large hi-res map which I want to use in an application (the image size is around 80 MB). I would like to know the following: How can I load this image in the best way possible? I know it will take some seconds to load the image (which is OK), but I would like to notify the user of the progress. I would like to use determinate mode and show this in some sort of JProgressBar to the user. This should reflect the number of bytes that have been loaded, or something like that. Is there any
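
One way to get a determinate JProgressBar is to wrap the input stream so each read reports bytes consumed against the known file length. A rough sketch with a placeholder path; note that ImageIO reads TIFF out of the box only on Java 9+, and the blocking read belongs off the Event Dispatch Thread in real code:

import java.awt.image.BufferedImage;
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import javax.imageio.ImageIO;
import javax.swing.JFrame;
import javax.swing.JProgressBar;
import javax.swing.SwingUtilities;

public class ProgressImageLoad {
    public static void main(String[] args) throws IOException {
        File file = new File("map.tif"); // placeholder path
        long size = file.length();
        JProgressBar bar = new JProgressBar(0, 100); // determinate mode
        JFrame frame = new JFrame("Loading...");
        frame.add(bar);
        frame.pack();
        frame.setVisible(true);

        // Wrap the stream so every read updates the bar with bytes seen so far.
        InputStream counting = new FilterInputStream(
                new BufferedInputStream(new FileInputStream(file))) {
            long seen = 0;
            @Override public int read(byte[] b, int off, int len) throws IOException {
                int n = super.read(b, off, len);
                if (n > 0) {
                    seen += n;
                    int pct = (int) (100 * seen / size);
                    SwingUtilities.invokeLater(() -> bar.setValue(pct));
                }
                return n;
            }
        };
        BufferedImage img = ImageIO.read(counting); // blocks; run off the EDT in real code
        System.out.println("Loaded " + img.getWidth() + "x" + img.getHeight());
    }
}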

C: handle large file

笑着哭i submitted on 2019-12-04 13:15:09
Question: I need to parse a file that could be many GB in size. I would like to do this in C. Can anyone suggest any methods to accomplish this? The file that I need to open and parse is a hard-drive dump that I get from my Mac's hard drive. However, I plan on running my program inside 64-bit Ubuntu 10.04. Also, given the large file size, the more optimized the method the better.

Answer 1: On both *nix and Windows, there are extensions to the I/O routines that touch file size that will support sizes
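
The answer is heading toward 64-bit offset support (fseeko and friends) and memory mapping. For comparison, the same windowed-mapping idea expressed in Java, where file positions are plain 64-bit longs but a single mapping is capped at 2 GB, so the file is mapped one region at a time (placeholder path):

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class WindowedMap {
    public static void main(String[] args) throws IOException {
        long window = 256L << 20; // 256 MB mapping window
        try (FileChannel ch = FileChannel.open(Paths.get("dump.img"),
                                               StandardOpenOption.READ)) {
            long size = ch.size();
            for (long pos = 0; pos < size; pos += window) {
                long len = Math.min(window, size - pos);
                MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, pos, len);
                // parse buf here; offsets are 64-bit longs, so >4 GB files are fine
                while (buf.hasRemaining()) buf.get(); // placeholder scan
            }
        }
    }
}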

Can someone provide an example of seeking, reading, and writing a >4GB file using boost iostreams

為{幸葍}努か submitted on 2019-12-04 11:46:37
Question: I have read that boost iostreams supposedly supports 64-bit access to large files in a semi-portable way. Their FAQ mentions 64-bit offset functions, but there are no examples of how to use them. Has anyone used this library for handling large files? A simple example of opening two files, seeking to their middles, and copying one to the other would be very helpful. Thanks.

Answer 1: Short answer: just include #include <boost/iostreams/seek.hpp> and use the seek function as in boost::iostreams::seek
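
For contrast with the boost version, here is the equivalent seek-to-middle-and-copy as a Java sketch; FileChannel offsets are 64-bit longs, so files past 4 GB need no special handling (file names are placeholders, and both files are assumed to exist):

import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class BigSeekCopy {
    public static void main(String[] args) throws IOException {
        try (FileChannel src = FileChannel.open(Paths.get("src.bin"),
                                                StandardOpenOption.READ);
             FileChannel dst = FileChannel.open(Paths.get("dst.bin"),
                                                StandardOpenOption.WRITE)) {
            long srcMid = src.size() / 2; // 64-bit offset, works past 4 GB
            long dstMid = dst.size() / 2;
            dst.position(dstMid);
            // copy the second half of src onto dst, starting at dst's middle
            long remaining = src.size() - srcMid, pos = srcMid;
            while (remaining > 0) {
                long n = src.transferTo(pos, remaining, dst);
                pos += n;
                remaining -= n;
            }
        }
    }
}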

C# serialize large array to disk

爷，独闯天下 submitted on 2019-12-04 11:39:44
I have a very large graph stored in a single-dimensional array (about 1.1 GB) which I am able to store in memory on my machine, which is running Windows XP with 2 GB of RAM and 2 GB of virtual memory. I am able to generate the entire data set in memory; however, when I try to serialize it to disk using the BinaryFormatter, the file size gets to about 50 MB and then it gives me an out-of-memory exception. The code I am using to write this is the same I use for all of my smaller problems:

StateInformation[] diskReady = GenerateStateGraph();
BinaryFormatter bf = new BinaryFormatter();
using (Stream
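
The usual escape is to stop serializing the whole object graph in one shot and stream the elements yourself, so peak memory stays near the buffer size. The idea sketched in Java, with the question's StateInformation reduced to a hypothetical pair of primitive fields:

import java.io.BufferedOutputStream;
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class StreamArrayToDisk {
    // Placeholder for the question's StateInformation: two primitive fields.
    record State(int id, double weight) {}

    public static void main(String[] args) throws IOException {
        State[] graph = new State[1_000_000]; // stand-in for the 1.1 GB array
        for (int i = 0; i < graph.length; i++) graph[i] = new State(i, i * 0.5);

        // Stream element fields one by one: peak extra memory is just the buffer,
        // unlike a whole-graph serializer that builds large intermediate state.
        try (DataOutputStream out = new DataOutputStream(
                 new BufferedOutputStream(new FileOutputStream("graph.bin")))) {
            out.writeInt(graph.length);
            for (State s : graph) {
                out.writeInt(s.id());
                out.writeDouble(s.weight());
            }
        }
    }
}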