large-files

Best Free Text Editor Supporting *More Than* 4GB Files? [closed]

Submitted by 依然范特西╮ on 2019-11-27 04:56:57
Question: I am looking for a text editor that will be able to load a 4+ gigabyte file into it. TextPad doesn't work. I own a copy of it and have

Upload large files in .NET

Submitted by 时光毁灭记忆、已成空白 on 2019-11-27 04:17:13
I've done a good bit of research to find an upload component for .NET that I can use to upload large files, shows a progress bar, and can resume the upload of large files. I've come across some components like AjaxUploader, SlickUpload, and PowUpload, to name a few. Each of these options costs money, and only PowUpload does the resumable upload, but it does it with a Java applet. I'm willing to pay for a component that does those things well, but if I could write it myself that would be best. I have two questions: Is it possible to resume a file upload on the client without using flash/java
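The chunk-and-resume idea the question asks about can be sketched independently of any particular upload component. Below is a hypothetical illustration in Python (not .NET): the "server" is stood in for by a local file, and resuming works by asking how many bytes the destination already holds and seeking past them. In a real client, each chunk would go out as an HTTP request carrying its offset.

```python
import os

CHUNK_SIZE = 64 * 1024  # hypothetical chunk size; tune for your network

def upload_resumable(src_path, dst_path):
    """Resume an interrupted transfer: check how many bytes the
    destination (standing in for the remote endpoint) already has,
    then send only the remaining chunks."""
    already = os.path.getsize(dst_path) if os.path.exists(dst_path) else 0
    with open(src_path, "rb") as src, open(dst_path, "ab") as dst:
        src.seek(already)               # skip what was already "uploaded"
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:
                break
            dst.write(chunk)            # real client: POST/PUT this chunk with its offset
```

Because the client never holds more than one chunk in memory, file size is bounded only by disk, not RAM.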

Reading large excel file with PHP

Submitted by 故事扮演 on 2019-11-27 03:35:20
Question: I'm trying to read a 17MB Excel file (2003) with PHPExcel 1.7.3c, but it crashes while loading the file, after exceeding the 120-second limit I have. Is there another library that can do it more efficiently? I have no need for styling, I only need it to support UTF-8. Thanks for your help. Answer 1: Filesize isn't a good measure when using PHPExcel; it's more important to get some idea of the number of cells (rows × columns) in each worksheet. If you have no need for styling, are you calling:

How to scan through really huge files on disk?

Submitted by 怎甘沉沦 on 2019-11-27 02:40:45
Question: Considering a really huge file (maybe more than 4GB) on disk, I want to scan through this file and count the occurrences of a specific binary pattern. My thought is: use a memory-mapped file (CreateFileMapping or Boost's mapped_file) to load the file into virtual memory. For each 100MB of mapped memory, create one thread to scan and tally the result. Is this feasible? Are there any better methods? Update: A memory-mapped file would be a good choice, since scanning through a 1.6GB file could be
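As a rough, single-threaded illustration of the memory-mapped approach (in Python rather than the asker's C++/Win32, and leaving out the per-100MB thread partitioning), the OS pages the file in on demand, so the scan never needs the whole file resident at once:

```python
import mmap

def count_pattern(path, pattern):
    """Count non-overlapping occurrences of a byte pattern in a file
    via a memory map, without reading the whole file into memory."""
    count = 0
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            pos = mm.find(pattern)
            while pos != -1:
                count += 1
                pos = mm.find(pattern, pos + len(pattern))
    return count
```

For a multi-threaded version, each worker would run the same loop over its own offset range, with a little care at range boundaries so a pattern straddling two ranges is not missed.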

Downloading a Large File - iPhone SDK

Submitted by 我是研究僧i on 2019-11-27 02:32:39
I am using Erica Sadun's method of asynchronous downloads (link here for the project file: download); however, her method does not work with files that have a big size (50 MB or above). If I try to download a file above 50 MB, it will usually crash because it runs out of memory. Is there any way I can tweak this code so that it works with large files as well? Here is the code I have in the DownloadHelper classes (which is already in the download link): .h @protocol DownloadHelperDelegate <NSObject> @optional - (void) didReceiveData: (NSData *) theData; - (void) didReceiveFilename: (NSString *) aName; -
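The crash comes from accumulating every didReceiveData: payload in memory until the download completes. The usual fix is to append each chunk to a file on disk as it arrives. A language-neutral sketch of that idea, shown here in Python with a simulated chunk stream (the function name is hypothetical, not from the SDK):

```python
def save_stream(chunks, out_path):
    """Append each incoming chunk straight to disk instead of
    accumulating the whole download in memory -- the analog of
    writing each didReceiveData: payload with an NSFileHandle.
    Returns the total number of bytes written."""
    total = 0
    with open(out_path, "wb") as out:
        for chunk in chunks:
            out.write(chunk)
            total += len(chunk)   # peak memory stays at one chunk
    return total
```

With this shape, a 50 MB (or 5 GB) download costs the same memory as a 50 KB one.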

Writing large files with Node.js

Submitted by 核能气质少年 on 2019-11-27 00:58:32
Question: I'm writing a large file with node.js using a writable stream: var fs = require('fs'); var stream = fs.createWriteStream('someFile.txt', { flags : 'w' }); var lines; while (lines = getLines()) { for (var i = 0; i < lines.length; i++) { stream.write( lines[i] ); } } I'm wondering if this scheme is safe without using the drain event? If it is not (which I think is the case), what is the pattern for writing arbitrarily large data to a file? Answer 1: That's how I finally did it. The idea behind it is to
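Node's answer to this is backpressure: stop calling write() once it returns false and resume on the 'drain' event, so the internal buffer never grows without bound. A loose Python analog of the same idea (HIGH_WATER is an assumed threshold for illustration, not a Node constant, and flush() here merely stands in for waiting on 'drain'):

```python
HIGH_WATER = 64 * 1024  # assumed high-water mark, mirrors Node's internal buffer

def write_lines(path, lines):
    """Write lines while respecting a high-water mark: once the pending
    data passes HIGH_WATER, flush before continuing -- the same shape
    as pausing when write() returns false and resuming on 'drain'."""
    buffered = 0
    with open(path, "w") as out:
        for line in lines:
            out.write(line)
            buffered += len(line)
            if buffered >= HIGH_WATER:
                out.flush()       # let the OS catch up before producing more
                buffered = 0
```

The key point in both languages is that the producer yields to the consumer periodically instead of queueing the entire output in memory.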

Very large uploads with PHP

Submitted by 时光总嘲笑我的痴心妄想 on 2019-11-27 00:09:33
I want to allow uploads of very large files into our PHP application (hundreds of megabytes up to 8 gigabytes). There are a couple of problems with this, however. Browser: HTML uploads have poor feedback; we need to either poll for progress (which is a bit silly) or show no feedback at all. A Flash uploader puts the entire file into memory before starting the upload. Server: PHP forces us to set post_max_size, which could result in an easily exploitable DoS attack. I'd like to not set this setting globally. The server also requires some other variables to be there in the POST vars, such as a secret key. We'd like

Large file upload though html form (more than 2 GB)

Submitted by 青春壹個敷衍的年華 on 2019-11-26 23:56:43
Is there any way to upload a file of more than 2 GB using a simple HTML form upload? Previously I had been uploading large files through Silverlight using chunking (dividing a large file into segments, uploading the segments one by one, and then reassembling them at the server). Now we have a requirement that we just have to use simple HTML (through GWT) form uploads. Please guide me if there is any way to achieve large file uploads this way. If it is impossible to do it using simple HTML, can anyone guide me on how I can divide and upload a file in segments using Flex? The limitation of the size

python: read lines from compressed text files

Submitted by 不问归期 on 2019-11-26 22:26:27
Is it easy to read a line from a gz-compressed text file using Python without extracting the file completely? I have a text.gz file which is around 200MB. When I extract it, it becomes 7.4GB. And this is not the only file I have to read; for the total process, I have to read 10 files. Although this will be a sequential job, I think it would be smart to do it without extracting all the information first. I do not even know whether it is possible. How can it be done using Python? I need to read the text file line by line. Have you tried using gzip.GzipFile? Arguments are similar to open. Using gzip
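The gzip module (gzip.GzipFile, or the higher-level gzip.open) does exactly what the asker wants: it decompresses on the fly, so iterating over the file object yields one line at a time and the 7.4GB is never materialized on disk. A minimal sketch:

```python
import gzip

def lines_from_gz(path):
    """Stream decoded lines from a .gz file; only a small decompression
    buffer lives in memory, never the full uncompressed text."""
    with gzip.open(path, "rt", encoding="utf-8") as f:   # "rt" = text mode
        for line in f:
            yield line.rstrip("\n")
```

For ten files, the same generator can simply be called in a loop; memory use stays flat regardless of how many files are processed.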

How to avoid OutOfMemoryError when uploading a large file using Jersey client

Submitted by  ̄綄美尐妖づ on 2019-11-26 22:12:21
I am using the Jersey client for HTTP-based requests. It works well if the file is small, but runs into an error when I post a file with a size of 700M: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space at java.util.Arrays.copyOf(Arrays.java:2786) at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94) at sun.net.www.http.PosterOutputStream.write(PosterOutputStream.java:61) at com.sun.jersey.api.client.CommittingOutputStream.write(CommittingOutputStream.java:90) at com.sun.jersey.core.util.ReaderWriter.writeTo(ReaderWriter.java:115) at com.sun.jersey.core.provider
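The stack trace shows the entire 700M body being copied into a ByteArrayOutputStream before sending. The commonly reported Jersey-side fix is to enable chunked transfer encoding (e.g. setChunkedEncodingSize on the Jersey 1.x Client) and hand the client a stream or File rather than a byte array. The general streaming idea, sketched in Python rather than Java:

```python
def file_chunks(path, chunk_size=8 * 1024 * 1024):
    """Yield a large file in fixed-size pieces so the request body
    never needs one giant in-memory buffer; an HTTP client can send
    each piece with chunked transfer encoding as it is produced."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk
```

Whatever the language, the rule is the same: the entity must be produced incrementally, so heap usage is bounded by the chunk size, not the file size.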