What is the best buffer size when using BinaryReader to read big files (>1 GB)?

梦谈多话 2020-12-05 16:26

I'm reading binary files and here is a sample:

public static byte[] ReadFully(Stream input)
{
    byte[] buffer = new byte[16 * 1024];
    using (MemoryStream ms = new MemoryStream())
    {
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, read);
        }
        return ms.ToArray();
    }
}
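The same pattern is available out of the box as Stream.CopyTo, whose second argument is the copy buffer size. A minimal sketch (the helper name is mine, not from the question); note that holding a file this way requires a single byte[], which .NET caps at about 2 GB, so for files larger than that a streaming approach is needed anyway:

```csharp
using System;
using System.IO;

public class ReadFullyDemo
{
    // Same idea as ReadFully above, using the built-in Stream.CopyTo;
    // its second argument is the size of the internal copy buffer.
    public static byte[] ReadFullyViaCopyTo(Stream input)
    {
        using (var ms = new MemoryStream())
        {
            input.CopyTo(ms, 64 * 1024);
            return ms.ToArray();
        }
    }

    public static void Main()
    {
        var data = new byte[100_000];
        new Random(1).NextBytes(data);
        byte[] result = ReadFullyViaCopyTo(new MemoryStream(data));
        Console.WriteLine(result.Length);   // 100000
    }
}
```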
2 Answers
  •  情书的邮戳
    2020-12-05 16:38

"Sequential File Programming Patterns and Performance with .NET" is a great article on improving I/O performance.

On page 8 of this PDF file, it shows that bandwidth is constant for buffer sizes larger than eight bytes. Bear in mind that the article was written in 2004 and the drive tested was a "Maxtor 250 GB 7200 RPM SATA disk", so the results may differ with more recent I/O technology.

If you are looking for the best performance, take a look at pinvoke.net, or at page 9 of the PDF file, where the un-buffered file performance measurements show better results:

    In un-buffered I/O, the disk data moves directly between the application’s address space and the device without any intermediate copying.
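On Windows, un-buffered I/O means opening the file with the Win32 FILE_FLAG_NO_BUFFERING flag. FileOptions has no named member for it, but FileStream accepts the raw value, a widely used (if undocumented) trick. The sketch below assumes this trick and a 4 KB sector size, and only applies the flag on Windows; real un-buffered I/O also requires sector-aligned buffer addresses, which this simplification glosses over:

```csharp
using System;
using System.IO;

public class UnbufferedRead
{
    // FILE_FLAG_NO_BUFFERING from the Win32 API; FileOptions has no named
    // member for it, but FileStream accepts the raw value.
    public const FileOptions NoBuffering = (FileOptions)0x20000000;

    public static long ReadAll(string path)
    {
        var options = FileOptions.SequentialScan;
        if (OperatingSystem.IsWindows())
            options |= NoBuffering;             // un-buffered I/O is Windows-specific here

        byte[] buffer = new byte[64 * 1024];    // a multiple of a typical 4 KB sector
        long total = 0;
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                   FileShare.Read, buffer.Length, options))
        {
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                total += read;                  // process buffer[0..read) here
        }
        return total;
    }

    public static void Main()
    {
        // Demo file with a sector-multiple size; "big.dat" is illustrative.
        File.WriteAllBytes("big.dat", new byte[128 * 1024]);
        Console.WriteLine(ReadAll("big.dat"));
    }
}
```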

    Summary

    • For single disks, use the defaults of the .NET framework – they deliver excellent performance for sequential file access.
    • Pre-allocate large sequential files (using the SetLength() method) when the file is created. This typically improves speed by about 13% when compared to a fragmented file.
    • At least for now, disk arrays require un-buffered I/O to achieve the highest performance - buffered I/O can be eight times slower than un-buffered I/O. We expect this problem will be addressed in later releases of the .NET framework.
    • If you do your own buffering, use large request sizes (64 KB is a good place to start). Using the .NET framework, a single processor can read and write a disk array at over 800 Mbytes/s using un-buffered I/O.
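The last two bullets can be sketched together: pre-allocate the destination with SetLength and copy with an explicit 64 KB buffer. Paths and sizes here are illustrative, not from the article:

```csharp
using System;
using System.IO;

public class LargeBufferCopy
{
    public static void Copy(string src, string dst)
    {
        byte[] buffer = new byte[64 * 1024];    // 64 KB, the article's suggested starting point

        using (var input = new FileStream(src, FileMode.Open, FileAccess.Read,
                   FileShare.Read, buffer.Length, FileOptions.SequentialScan))
        using (var output = new FileStream(dst, FileMode.Create, FileAccess.Write))
        {
            output.SetLength(input.Length);     // pre-allocate to avoid fragmentation
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);
        }
    }

    public static void Main()
    {
        File.WriteAllBytes("src.bin", new byte[200_000]);
        Copy("src.bin", "dst.bin");
        Console.WriteLine(new FileInfo("dst.bin").Length);  // 200000
    }
}
```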
