Is there any way to import a JSON file (containing 100 documents) into an Elasticsearch server?

Frontend · unresolved · 9 answers · 1702 views

Asked by 孤街浪徒 on 2020-12-04 10:39

Is there any way to import a JSON file (containing 100 documents) into an Elasticsearch server? I want to import a big JSON file into es-server.

9 Answers
  •  遥遥无期, answered 2020-12-04 11:02

    As dadoonet already mentioned, the bulk API is probably the way to go. To transform your file into the format the bulk protocol expects, you can use jq.

    Assuming the file contains just the documents itself:

    $ echo '{"foo":"bar"}{"baz":"qux"}' | 
    jq -c '
    { index: { _index: "myindex", _type: "mytype" } },
    . '
    
    {"index":{"_index":"myindex","_type":"mytype"}}
    {"foo":"bar"}
    {"index":{"_index":"myindex","_type":"mytype"}}
    {"baz":"qux"}
    

    And if the file contains the documents in a top level list they have to be unwrapped first:

    $ echo '[{"foo":"bar"},{"baz":"qux"}]' | 
    jq -c '
    .[] |
    { index: { _index: "myindex", _type: "mytype" } },
    . '
    
    {"index":{"_index":"myindex","_type":"mytype"}}
    {"foo":"bar"}
    {"index":{"_index":"myindex","_type":"mytype"}}
    {"baz":"qux"}
    

    jq's -c (compact) flag ensures that each document is emitted on a single line, which the bulk protocol requires.

    If you want to pipe straight to curl, use --data-binary @- rather than -d; otherwise curl will strip the newlines and the bulk request will be rejected.
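    If jq isn't available, the same transformation can be sketched in Python using only the standard library. This is a minimal sketch, not the answer's own method: the input filename docs.json, the index name myindex, and the localhost:9200 URL are all assumptions to adjust for your setup. It also omits the _type field from the action line, since document types were removed in Elasticsearch 8 (the jq examples above target an older version).

    ```python
    import json
    import urllib.request

    def to_bulk(docs, index="myindex"):
        """Build a bulk-API body: an action line before each document,
        newline-delimited, with the trailing newline the API requires."""
        lines = []
        for doc in docs:
            lines.append(json.dumps({"index": {"_index": index}}))
            lines.append(json.dumps(doc, separators=(",", ":")))
        return "\n".join(lines) + "\n"

    if __name__ == "__main__":
        # Hypothetical input file containing a top-level JSON list of documents.
        with open("docs.json") as f:
            docs = json.load(f)
        req = urllib.request.Request(
            "http://localhost:9200/_bulk",      # adjust for your cluster
            data=to_bulk(docs).encode("utf-8"),
            headers={"Content-Type": "application/x-ndjson"},
        )
        urllib.request.urlopen(req)
    ```

    On the command line, the equivalent one-liner would be something like `jq -c '.[] | { index: { _index: "myindex" } }, .' docs.json | curl --data-binary @- -H 'Content-Type: application/x-ndjson' http://localhost:9200/_bulk`.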
