Fastest way to incrementally read a large file

Backend · Unresolved · 2 answers · 1825 views

灰色年华 2020-12-13 20:13

When given a buffer of MAX_BUFFER_SIZE, and a file that far exceeds it, how can one:

  1. Read the file in blocks of MAX_BUFFER_SIZE?
  2. Do it as fast as possible?
2 Answers
  •  生来不讨喜
    2020-12-13 21:01

    Assuming that you need to read the entire file into memory at once (as you're currently doing), neither reading smaller chunks nor NIO are going to help you here.

    In fact, you'd probably be best reading larger chunks - which your regular IO code is automatically doing for you.
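    The buffering that regular IO does for you can be seen with BufferedInputStream, which reads ahead from the underlying stream in large blocks (8 KB by default), so even single-byte read() calls are served from memory. A minimal sketch (the temp file and its size are purely illustrative):

    import java.io.BufferedInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    
    public class BufferedReadDemo {
        public static void main(String[] args) throws IOException {
            // Create a small sample file just for this demo.
            Path tmp = Files.createTempFile("demo", ".bin");
            Files.write(tmp, new byte[100]);
    
            long count = 0;
            // BufferedInputStream refills its internal buffer in large reads,
            // so the per-byte read() below does not hit the disk each time.
            try (InputStream in = new BufferedInputStream(Files.newInputStream(tmp))) {
                while (in.read() != -1) {
                    count++;
                }
            }
            System.out.println(count); // 100
            Files.delete(tmp);
        }
    }
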

    Your NIO code is currently slower, because you're only reading one byte at a time (using buffer.get()).
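    If you do want to stick with NIO, the fix is to read in bulk rather than one byte per call. A hedged sketch using FileChannel and a ByteBuffer (the temp file, its contents, and the 8 KB buffer size are assumptions for illustration):

    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;
    
    public class BulkNioRead {
        public static void main(String[] args) throws IOException {
            // Small sample file for demonstration only.
            Path tmp = Files.createTempFile("demo", ".bin");
            Files.write(tmp, new byte[]{1, 2, 3, 4, 5});
    
            long total = 0;
            try (FileChannel ch = FileChannel.open(tmp, StandardOpenOption.READ)) {
                ByteBuffer buf = ByteBuffer.allocate(8192); // read in 8 KB blocks
                while (ch.read(buf) != -1) {               // fills the buffer in bulk
                    buf.flip();                            // switch to reading mode
                    total += buf.remaining();              // process the whole chunk
                    buf.clear();                           // ready for the next read
                }
            }
            System.out.println(total); // 5
            Files.delete(tmp);
        }
    }
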

    If you want to process in chunks - for example, transferring between streams - here is a standard way of doing it without NIO:

    InputStream is = ...;
    OutputStream os = ...;
    
    byte[] buffer = new byte[1024];
    int read;
    // read() returns the number of bytes actually read, or -1 at end of stream
    while ((read = is.read(buffer)) != -1) {
        os.write(buffer, 0, read); // write only the bytes that were read
    }
    

    This uses a buffer size of only 1 KB, but can transfer an unlimited amount of data.
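    Since Java 9, this stream-to-stream copy is also built in as InputStream.transferTo, which loops over an internal buffer for you. A small sketch using in-memory streams to keep it self-contained:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    
    public class TransferDemo {
        public static void main(String[] args) throws IOException {
            byte[] data = "hello world".getBytes();
            ByteArrayInputStream is = new ByteArrayInputStream(data);
            ByteArrayOutputStream os = new ByteArrayOutputStream();
    
            is.transferTo(os); // copies until EOF using an internal buffer
            System.out.println(os.size()); // 11
        }
    }
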

    (If you extend your question with details of what you're actually looking to do at a functional level, I could improve this answer further.)
