Fastest way to read every 30th byte of large binary file?


What is the fastest way to read every 30th byte of a large binary file (2-3 GB)? I've read there are performance problems with fseek because of I/O buffers, but I don't want to read 2-3 GB of data into memory before grabbing every 30th byte either.

7 Answers
  •  长情又很酷
    2020-12-24 14:49

    If you are reading data from a hard disk with a spinning platter, the answer is: read the whole file sequentially using a large buffer and discard the portions in memory you don't want.

    The smallest unit of access possible on a standard hard disk drive is the sector, and sector sizes on all common spinning disk drives (typically 512 bytes or 4 KiB) are many times larger than 30 bytes. This means the hard disk controller must read each and every sector anyway, regardless of what the request from the host looks like. There is no low-level magic that can change this.

    Even if that were not the case and you could read individual bytes, seek operations carry a huge cost compared to sequential reads. The best possible case is still no better than a sequential read. In the real world, I wouldn't be surprised if signaling overhead alone prevented such schemes from working, even with a massive command buffer.
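
    A minimal sketch of that approach in C, assuming a 1 MiB chunk size and a placeholder process_byte() for whatever you actually do with each byte (both are illustrative choices, not part of the answer): read the file sequentially with fread into a large buffer and pick out every 30th byte as the chunks pass through memory.

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    #define CHUNK_SIZE (1u << 20)   /* 1 MiB per read; large enough to stay sequential */
    #define STRIDE     30           /* keep every 30th byte */

    static void process_byte(unsigned char b)
    {
        (void)b;                    /* placeholder for real work */
    }

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <file>\n", argv[0]);
            return 1;
        }

        FILE *fp = fopen(argv[1], "rb");
        if (!fp) {
            perror("fopen");
            return 1;
        }

        unsigned char *buf = malloc(CHUNK_SIZE);
        if (!buf) {
            perror("malloc");
            fclose(fp);
            return 1;
        }

        size_t pos = 0;   /* file offset where the current chunk starts */
        size_t next = 0;  /* file offset of the next wanted byte (0, 30, 60, ...) */
        size_t n;
        while ((n = fread(buf, 1, CHUNK_SIZE, fp)) > 0) {
            /* pull every wanted byte that falls inside this chunk */
            while (next < pos + n) {
                process_byte(buf[next - pos]);
                next += STRIDE;
            }
            pos += n;
        }

        free(buf);
        fclose(fp);
        return 0;
    }
    ```

    Because `next` is tracked as an absolute file offset, the stride carries correctly across chunk boundaries even when a read returns fewer bytes than requested.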
