Fast & Efficient Way To Read Large JSON Files Line By Line in Java

Submitted by 给你一囗甜甜゛ on 2019-12-01 08:29:05

You can use the JSON Processing API (JSR 353) to process your data in a streaming fashion:

import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

import javax.json.Json;
import javax.json.stream.JsonParser;

...

String dataPath = "data.json";

try(JsonParser parser = Json.createParser(new FileReader(dataPath))) {
     List<String> row = new ArrayList<>();

     while(parser.hasNext()) {
         JsonParser.Event event = parser.next();
         switch(event) {
             case START_ARRAY:
                 continue;
             case VALUE_STRING:
                 row.add(parser.getString());
                 break;
             case END_ARRAY:
                 if(!row.isEmpty()) {
                     //Do something with the current row of data 
                     System.out.println(row);

                     //Reset it (prepare for the new row) 
                     row.clear();
                 }
                 break;
             default:
                 throw new IllegalStateException("Unexpected JSON event: " + event);
         }
     }
}
iMysak

Please take a look at the Jackson Streaming API.

I guess you are looking for something like this - https://www.ngdata.com/parsing-a-large-json-file-efficiently-and-easily/

and this - https://stackoverflow.com/a/24838392/814304

The main thing: if you have a big file, you need to read and process it lazily, piece by piece.
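For reference, here is a minimal sketch of that approach with Jackson's streaming JsonParser, assuming Jackson 2.x (jackson-core) on the classpath and input shaped like the JSR 353 example above (a JSON array of string arrays); the file path and class name are placeholders:

    import java.io.File;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    import com.fasterxml.jackson.core.JsonFactory;
    import com.fasterxml.jackson.core.JsonParser;
    import com.fasterxml.jackson.core.JsonToken;

    public class JacksonStreamingExample {
        public static void main(String[] args) throws IOException {
            JsonFactory factory = new JsonFactory();
            try (JsonParser parser = factory.createParser(new File("data.json"))) {
                List<String> row = new ArrayList<>();
                JsonToken token;
                // Pull tokens one at a time; only the current row is kept in memory
                while ((token = parser.nextToken()) != null) {
                    if (token == JsonToken.VALUE_STRING) {
                        row.add(parser.getText());   // collect one value of the current row
                    } else if (token == JsonToken.END_ARRAY && !row.isEmpty()) {
                        System.out.println(row);     // do something with the completed row
                        row.clear();                 // prepare for the next row
                    }
                }
            }
        }
    }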

You can use JsonSurfer to extract all the inner JSON arrays with the JsonPath $[*]:

    import org.jsfr.json.JsonPathListener;
    import org.jsfr.json.JsonSurfer;
    import org.jsfr.json.JsonSurferJackson;
    import org.jsfr.json.ParsingContext;

    JsonSurfer surfer = JsonSurferJackson.INSTANCE;
    surfer.configBuilder().bind("$[*]", new JsonPathListener() {
        @Override
        public void onValue(Object value, ParsingContext context) {
            // Called once for each element matched by $[*]
            System.out.println(value);
        }
    }).buildAndSurf(json);

It won't load the entire JSON into memory; the elements of the JSON array are processed one by one.
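Note that json in the snippet above stands for the input to parse; for a large file you would feed JsonSurfer a stream of the file rather than a String holding its full contents (check which input types buildAndSurf accepts in your JsonSurfer version).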
