Delphi: Alternative to using Reset/ReadLn for text file reading

Backend · Open · 7 answers · 1206 views

时光取名叫无心 2020-12-14 11:10

I want to process a text file line by line. In the olden days I loaded the file into a StringList:

slFile := TStringList.Create();
slFile.LoadFromFile(filename);


7 answers
  • 2020-12-14 11:46

    Why not simply read the lines of the file directly from the TFileStream itself, one at a time?

    i.e. (in pseudocode):

      readline: 
        while NOT EOF and (readchar <> EOL) do
          appendchar to result
    
    
      while NOT EOF do
      begin
        s := readline
        process s
      end;
    

    One problem you may find with this is that, IIRC, TFileStream is not buffered, so performance over a large file is going to be sub-optimal. However, there are a number of solutions to the problem of non-buffered streams, including this one, that you may wish to investigate if this approach solves your initial problem.
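
    The pseudocode above might look roughly like this in Delphi (a minimal sketch; `ReadLine` and `ProcessFile` are illustrative names, and the byte-at-a-time reading is exactly the unbuffered-performance caveat mentioned above):

    ```pascal
    uses Classes, SysUtils;

    // Read one line from the stream, stripping CR/LF.
    function ReadLine(Stream: TStream): string;
    var
      ch: AnsiChar;
    begin
      Result := '';
      while Stream.Position < Stream.Size do
      begin
        Stream.ReadBuffer(ch, 1);
        if ch = #10 then
          Break;                       // end of line
        if ch <> #13 then
          Result := Result + Char(ch); // skip the CR of a CRLF pair
      end;
    end;

    procedure ProcessFile(const FileName: string);
    var
      fs: TFileStream;
      s: string;
    begin
      fs := TFileStream.Create(FileName, fmOpenRead or fmShareDenyNone);
      try
        while fs.Position < fs.Size do
        begin
          s := ReadLine(fs);
          // process s here
        end;
      finally
        fs.Free;
      end;
    end;
    ```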

  • 2020-12-14 11:47

    With recent Delphi versions, you can use TStreamReader. Construct it with your file stream, and then call its ReadLine method (inherited from TTextReader).

    An option for all Delphi versions is to use Peter Below's StreamIO unit, which gives you AssignStream. It works just like AssignFile, but for streams instead of file names. Once you've used that function to associate a stream with a TextFile variable, you can call ReadLn and the other I/O functions on it just like any other file.
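
    For example (a minimal sketch, assuming Delphi 2009 or later and a UTF-8 file; `ProcessFile` is an illustrative name):

    ```pascal
    uses System.Classes, System.SysUtils;

    procedure ProcessFile(const FileName: string);
    var
      Reader: TStreamReader;
      Line: string;
    begin
      // TStreamReader buffers the underlying file stream for you.
      Reader := TStreamReader.Create(FileName, TEncoding.UTF8);
      try
        while not Reader.EndOfStream do
        begin
          Line := Reader.ReadLine;
          // process Line here
        end;
      finally
        Reader.Free;
      end;
    end;
    ```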

  • 2020-12-14 11:48

    It seems the FileMode variable does not apply to text files, but my tests showed that reading the same file from multiple places at once is not a problem. You didn't mention it in your question, but as long as you are not going to write to the text file while it is being read, you should be fine.

  • 2020-12-14 11:49

    If you need support for ANSI and Unicode files in older Delphi versions, you can use my GpTextFile or GpTextStream.

  • 2020-12-14 11:49

    What I do is use a TFileStream but I buffer the input into fairly large blocks (e.g. a few megabytes each) and read and process one block at a time. That way I don't have to load the whole file at once.

    It works quite quickly that way, even for large files.

    I do have a progress indicator. As I load each block, I increment it by the fraction of the file that has additionally been loaded.

    Reading one line at a time, without something to do your buffering, is simply too slow for large files.
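
    A sketch of this block-buffered approach (assuming a readable file; it deliberately leaves out the handling of lines that straddle block boundaries, which a real version must deal with):

    ```pascal
    uses Classes, SysUtils;

    const
      BlockSize = 4 * 1024 * 1024; // a few megabytes per block
    var
      fs: TFileStream;
      Buffer: TBytes;
      BytesRead: Integer;
      Progress: Double;
    begin
      fs := TFileStream.Create('data.txt', fmOpenRead or fmShareDenyWrite);
      try
        SetLength(Buffer, BlockSize);
        repeat
          BytesRead := fs.Read(Buffer[0], BlockSize);
          // split Buffer[0..BytesRead-1] into lines and process them here
          if fs.Size > 0 then
            Progress := fs.Position / fs.Size; // update progress indicator
        until BytesRead < BlockSize;
      finally
        fs.Free;
      end;
    end.
    ```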

  • 2020-12-14 11:51

    I had the same problem a few years ago, especially the problem of locking the file. What I did was use the low-level ReadFile function from the Windows API (declared in the Windows unit). I know the question is old and my answer comes two years later, but perhaps my contribution can help someone in the future.

    const
      BUFF_SIZE = $8000;
    var
      dwread: LongWord;
      hFile: THandle;
      datafile: array [0..BUFF_SIZE - 1] of AnsiChar; // AnsiChar: ReadFile counts bytes
      apos: Integer;
      myEOF: Boolean;
    
    // filename is assumed to be set elsewhere
    hFile := CreateFile(PChar(filename), GENERIC_READ,
      FILE_SHARE_READ or FILE_SHARE_WRITE, nil, OPEN_EXISTING,
      FILE_ATTRIBUTE_READONLY, 0);
    SetFilePointer(hFile, 0, nil, FILE_BEGIN);
    myEOF := False;
    try
      ReadFile(hFile, datafile, BUFF_SIZE, dwread, nil);
      while (dwread > 0) and (not myEOF) do
      begin
        if dwread = BUFF_SIZE then
        begin
          // step back to just after the last line break, so the next
          // read starts on a line boundary
          apos := LastDelimiter(#10#13, datafile);
          if apos = BUFF_SIZE then
            Inc(apos);
          SetFilePointer(hFile, apos - BUFF_SIZE, nil, FILE_CURRENT);
        end
        else
          myEOF := True;
        // process the lines in datafile here, then read the next block
        ReadFile(hFile, datafile, BUFF_SIZE, dwread, nil);
      end;
    finally
      CloseHandle(hFile);
    end;
    

    For me the speed improvement appeared to be significant.
