large-files

How to detect X-Accel-Redirect (Nginx) / X-Sendfile (Apache) support in PHP?

纵饮孤独 submitted on 2019-11-29 02:48:44

Question: About the application: I am working on an e-commerce application in PHP. To keep URLs secure, product download links are kept behind PHP. There is a file, say download.php, which accepts a few parameters via GET and verifies them against a database. If all goes well, it serves the file using PHP's readfile() function. About the problem: the problem comes when the file to be passed to readfile() is larger than the memory limit set in php.ini. As this application will be used by many users on shared hosting, we cannot …
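A common way around both concerns (memory limits and access control) is to let the application do the authorization, then hand the actual transfer back to the web server via an X-Accel-Redirect (nginx) or X-Sendfile (Apache mod_xsendfile) response header. Below is a minimal sketch of that handoff, written as a Java servlet purely for illustration since the PHP excerpt above is truncated; the /protected/ location, parameter name, and paths are assumptions, and nginx must be configured with a matching internal location.

    // Hypothetical sketch: the application authorizes the request, then hands
    // the transfer to the front-end server via a response header. Assumes
    // nginx has an "internal" location /protected/ mapped to the real files.
    import java.io.IOException;
    import javax.servlet.http.*;

    public class DownloadServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String name = req.getParameter("file");   // validate against the DB first!
            resp.setContentType("application/octet-stream");
            resp.setHeader("Content-Disposition", "attachment; filename=\"" + name + "\"");
            // nginx intercepts this header and serves the file itself, so the
            // application never loads the file into memory.
            resp.setHeader("X-Accel-Redirect", "/protected/" + name);
            // With Apache + mod_xsendfile the equivalent would be:
            // resp.setHeader("X-Sendfile", "/var/downloads/" + name);
        }
    }

In PHP the same handoff is just a header() call after the database check. Note that support generally cannot be auto-detected from inside the application, because the front-end server consumes the header; this is why most applications expose it as an explicit configuration switch rather than probing for it.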

Java: InputStream too slow to read huge files

为君一笑 submitted on 2019-11-29 01:36:12

I have to read a 53 MB file character by character. When I do it in C++ using ifstream, it completes in milliseconds, but using Java InputStream it takes several minutes. Is it normal for Java to be this slow, or am I missing something? Also, I need to complete the program in Java (it uses servlets, from which I have to call the functions which process these characters). I was thinking of maybe writing the file-processing part in C or C++ and then using the Java Native Interface to interface those functions with my Java programs... What do you think of that idea? Can anyone give me any other tip? I seriously …
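The usual culprit is reading one character at a time from an unbuffered stream, which costs a system call per read(). Wrapping the stream in a BufferedReader keeps the character-by-character loop but amortizes the I/O. A minimal sketch, with a hypothetical file name:

    import java.io.*;

    public class CharReader {
        public static void main(String[] args) throws IOException {
            try (Reader in = new BufferedReader(
                    new InputStreamReader(new FileInputStream("big.txt")), 1 << 16)) {
                int c;
                long count = 0;
                while ((c = in.read()) != -1) {   // still character by character,
                    count++;                      // but backed by a 64 KB buffer
                }
                System.out.println(count + " characters read");
            }
        }
    }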

Random access gzip stream

生来就可爱ヽ(ⅴ<●) submitted on 2019-11-28 23:36:07

I'd like to be able to do random access into a gzipped file. I can afford to do some preprocessing on it (say, build some kind of index), provided that the result of the preprocessing is much smaller than the file itself. Any advice? My thoughts were: Hack on an existing gzip implementation and serialize its decompressor state every, say, 1 megabyte of compressed data. Then, to do random access, deserialize the decompressor state and read from the megabyte boundary. This seems hard, especially since I'm working with Java and I couldn't find a pure-Java gzip implementation :( Re-compress the …
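A rough sketch of the second idea the excerpt trails off into - re-compressing the data as independently decompressible blocks plus an index - using only java.util.zip. The 1 MB block size, index persistence, and all names are choices made for illustration; random access then means bisecting the index and decompressing a single block:

    import java.io.*;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.zip.GZIPOutputStream;

    public class BlockGzip {
        static final int BLOCK = 1 << 20; // 1 MB of uncompressed data per block

        // Re-compresses the input as independent gzip members and returns the
        // compressed-file offset at which each block starts. Block i covers
        // uncompressed bytes [i*BLOCK, (i+1)*BLOCK).
        static List<Long> compress(InputStream in, File out) throws IOException {
            List<Long> index = new ArrayList<>();
            byte[] buf = new byte[BLOCK];
            try (FileOutputStream fos = new FileOutputStream(out)) {
                long written = 0;
                int n;
                while ((n = readFully(in, buf)) > 0) {
                    index.add(written);
                    ByteArrayOutputStream bos = new ByteArrayOutputStream();
                    try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
                        gz.write(buf, 0, n);      // each block compresses on its own
                    }
                    fos.write(bos.toByteArray());
                    written += bos.size();
                }
            }
            return index; // persist alongside the file for later seeks
        }

        // Fills buf as far as possible; returns the number of bytes read.
        static int readFully(InputStream in, byte[] buf) throws IOException {
            int off = 0, n;
            while (off < buf.length && (n = in.read(buf, off, buf.length - off)) > 0) {
                off += n;
            }
            return off;
        }
    }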

Streaming large images using ASP.Net Webapi

℡╲_俬逩灬. submitted on 2019-11-28 23:24:33

We are trying to return large image files using ASP.Net WebApi, and we use the following code to stream the bytes to the client.

    public class RetrieveAssetController : ApiController
    {
        // GET api/retrieveasset/5
        public HttpResponseMessage GetAsset(int id)
        {
            HttpResponseMessage httpResponseMessage = new HttpResponseMessage();
            string filePath = "SomeImageFile.jpg";
            MemoryStream memoryStream = new MemoryStream();
            FileStream file = new FileStream(filePath, FileMode.Open, FileAccess.Read);
            byte[] bytes = new byte[file.Length];
            file.Read(bytes, 0, (int)file.Length);
            memoryStream.Write(bytes, 0, (int)file.Length);
            ...
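The code above buffers the entire file twice (the byte array and the MemoryStream), which is what hurts with large images. The fix in any stack is to copy from the file to the response in small chunks. A sketch of that pattern, written as a Java servlet for consistency with the other sketches in this digest rather than as WebApi code; the buffer size and content type are arbitrary choices:

    import java.io.*;
    import javax.servlet.http.*;

    public class AssetServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            File file = new File("SomeImageFile.jpg");
            resp.setContentType("image/jpeg");
            resp.setContentLengthLong(file.length());
            byte[] buf = new byte[64 * 1024];
            try (InputStream in = new FileInputStream(file);
                 OutputStream out = resp.getOutputStream()) {
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);   // at most 64 KB in memory at a time
                }
            }
        }
    }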

Computing MD5SUM of large files in C#

六眼飞鱼酱① submitted on 2019-11-28 21:34:55

I am using the following code to compute the MD5SUM of a file:

    byte[] b = System.IO.File.ReadAllBytes(file);
    string sum = BitConverter.ToString(new MD5CryptoServiceProvider().ComputeHash(b));

This works fine normally, but if I encounter a large file (~1 GB) - e.g. an ISO image or a DVD VOB file - I get an Out of Memory exception. However, I am able to compute the MD5SUM of the same file in cygwin in about 10 seconds. Please suggest how I can get this to work for big files in my program. Thanks.

I suggest using the alternate method: MD5CryptoServiceProvider.ComputeHash(Stream), and just pass in an input …
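The suggested fix (hash a stream instead of a byte array) keeps memory use constant regardless of file size. The same idea in Java, shown for consistency with the other sketches in this digest; the file name is hypothetical:

    import java.io.*;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    public class Md5File {
        public static void main(String[] args)
                throws IOException, NoSuchAlgorithmException {
            MessageDigest md5 = MessageDigest.getInstance("MD5");
            byte[] buf = new byte[1 << 20];           // hash in 1 MB chunks
            try (InputStream in = new FileInputStream("image.iso")) {
                int n;
                while ((n = in.read(buf)) != -1) {
                    md5.update(buf, 0, n);
                }
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : md5.digest()) hex.append(String.format("%02x", b));
            System.out.println(hex);
        }
    }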

Fast Search to see if a String Exists in Large Files with Delphi

折月煮酒 submitted on 2019-11-28 20:58:01

I have a FindFile routine in my program which will list files, but if the "Containing Text" field is filled in, then it should only list files containing that text. If the "Containing Text" field is entered, then I search each file found for the text. My current method of doing that is:

    var
      FileContents: TStringList;
    begin
      FileContents := TStringList.Create;
      FileContents.LoadFromFile(Filepath);
      Found := Pos(TextToFind, FileContents.Text) > 0;

The above code is simple, and it generally works okay. But it has two problems: (1) it fails for very large files (e.g. 300 MB), and (2) I feel it could be faster. It …
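Both problems stem from loading each file whole. Scanning fixed-size chunks with a small overlap bounds memory and avoids building one giant string. A sketch of that chunked scan, in Java for consistency with the other sketches here; it assumes a single-byte encoding and a 1 MB chunk size:

    import java.io.*;
    import java.nio.charset.StandardCharsets;

    public class ContainsText {
        static boolean fileContains(File f, String needle) throws IOException {
            byte[] pat = needle.getBytes(StandardCharsets.ISO_8859_1);
            if (pat.length == 0) return true;
            int chunk = 1 << 20;
            // Extra pat.length-1 bytes so a match straddling two chunks is seen.
            byte[] buf = new byte[chunk + pat.length - 1];
            try (InputStream in = new BufferedInputStream(new FileInputStream(f))) {
                int kept = 0;   // tail bytes carried over from the previous chunk
                int n;
                while ((n = in.read(buf, kept, buf.length - kept)) != -1) {
                    int len = kept + n;
                    if (indexOf(buf, len, pat) >= 0) return true;
                    kept = Math.min(pat.length - 1, len);
                    System.arraycopy(buf, len - kept, buf, 0, kept);
                }
            }
            return false;
        }

        // Naive byte search over the first len bytes of hay.
        static int indexOf(byte[] hay, int len, byte[] pat) {
            outer:
            for (int i = 0; i + pat.length <= len; i++) {
                for (int j = 0; j < pat.length; j++)
                    if (hay[i + j] != pat[j]) continue outer;
                return i;
            }
            return -1;
        }
    }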

Advice on handling large data volumes

时光怂恿深爱的人放手 submitted on 2019-11-28 19:55:46

So I have a "large" number of "very large" ASCII files of numerical data (gigabytes altogether), and my program will need to process the entirety of it sequentially at least once. Any advice on storing/loading the data? I've thought of converting the files to binary to make them smaller and faster to load. Should I load everything into memory all at once? If not, what's a good way of loading the data partially? What are some Java-relevant efficiency tips?

Stu Thompson: So then what if the processing requires jumping around in the data for multiple files and multiple buffers? Is …

Reading Huge File in Python

大憨熊 submitted on 2019-11-28 18:54:24

I have a 384 MB text file with 50 million lines. Each line contains 2 space-separated integers: a key and a value. The file is sorted by key. I need an efficient way of looking up the values for a list of about 200 keys in Python. My current approach is included below. It takes 30 seconds. There must be more efficient Python foo to get this down to a reasonable efficiency of a couple of seconds at most.

    # list contains a sorted list of the keys we need to lookup
    # there is a sentinel at the end of list to simplify the code
    # we use pointer to iterate through the list of keys
    for line in fin:
        ...
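Because the file is sorted by key, each lookup can binary-search the file itself via seek(), costing O(log n) short reads per key instead of a full scan. A rough sketch of the idea, written in Java with RandomAccessFile to match the other sketches in this digest; the same seek-and-bisect approach works with Python's file.seek(). It assumes one well-formed key-value pair per line:

    import java.io.*;

    public class SortedFileLookup {
        // Returns the value for `key`, or null if absent.
        static Long lookup(RandomAccessFile f, long key) throws IOException {
            long lo = 0, hi = f.length();
            while (lo < hi) {
                long mid = (lo + hi) / 2;
                f.seek(mid);
                if (mid > 0) f.readLine();       // skip the partial line we landed in
                String line = f.readLine();
                long after = f.getFilePointer(); // start of the following line
                if (line == null) { hi = mid; continue; } // seeked past the last line
                String[] parts = line.trim().split("\\s+");
                long k = Long.parseLong(parts[0]);
                if (k == key) return Long.parseLong(parts[1]);
                if (k < key) lo = after;
                else hi = mid;
            }
            return null;
        }
    }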

Read lines by number from a large file

狂风中的少年 submitted on 2019-11-28 17:16:24

I have a file with 15 million lines (it will not fit in memory). I also have a small vector of line numbers - the lines that I want to extract. How can I read out the lines in one pass? I was hoping for a C function that does it in one pass.

The trick is to use a connection AND open it before read.table:

    con <- file('filename')
    open(con)
    read.table(con, skip=5,  nrow=1)  # 6th line
    read.table(con, skip=20, nrow=1)  # 27th line
    ...
    close(con)

You may also try scan; it is faster and gives more control.

Ari B. Friedman: If it's a binary file, some discussion is here: Reading in only part of a Stata .DTA file …
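The same one-pass idea sketched in Java for illustration: sort the wanted line numbers, stream the file once, and keep only the requested lines, so memory scales with the size of the request rather than the file:

    import java.io.*;
    import java.util.*;

    public class PickLines {
        static List<String> pick(File f, int[] wanted) throws IOException {
            int[] targets = wanted.clone();
            Arrays.sort(targets);                 // one pass needs ascending order
            List<String> out = new ArrayList<>();
            try (BufferedReader in = new BufferedReader(new FileReader(f))) {
                String line;
                int lineNo = 1, i = 0;
                while (i < targets.length && (line = in.readLine()) != null) {
                    while (i < targets.length && targets[i] == lineNo) {
                        out.add(line);            // also handles duplicate requests
                        i++;
                    }
                    lineNo++;
                }
            }
            return out;
        }
    }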

How to read a large text file on Windows? [closed]

好久不见. submitted on 2019-11-28 16:28:07

I have a large server log file (~750 MB) which I can't open with either Notepad or Notepad++ (they both say the file is too large). Can anyone suggest a program (for Windows) that will only read a small part of the file into memory at a time? Or do I need to write my own app to parse this file?

Daniel Silveira: Try this... Large Text File Viewer. By the way, it is free :) But I think you should ask this on serverfault.com instead.

If all you need is a tool for reading, then this thing will open the file instantly: http://www.readfileonline.com/

Use EmEditor, it's pretty good; I used it to open a …