How to efficiently import many large JSON files directly from S3 into MongoDB

忘了有多久 · 2021-02-06 10:25

I have compressed JSON files in S3 and I would like to set up MongoDB on EC2 to serve the JSON documents contained in these files. The compressed files are >100 MB each and there are 1000…

2 Answers
  •  無奈伤痛 · 2021-02-06 11:00

    You don't need to store intermediate files: you can pipe the S3 object to stdout and have mongoimport read from stdin.

    Your full command would look something like:

    s3cmd get s3://<bucket>/<filename> - | mongoimport -d <dbname> -c <collection>
    

    Note the -, which tells s3cmd to write the object to stdout rather than to a local file.
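    Since the files are compressed, mongoimport cannot read them directly, so you would decompress in the pipeline; and with ~1000 files you can loop over the bucket listing. A minimal sketch, assuming gzip-compressed, newline-delimited JSON and hypothetical bucket, database, and collection names:

        # Stream every .json.gz object through gunzip into mongoimport,
        # never writing an intermediate file to disk.
        # s3cmd ls prints date, time, size, URL; the URL is field 4.
        for f in $(s3cmd ls s3://mybucket/ | awk '{print $4}' | grep '\.json\.gz$'); do
            s3cmd get "$f" - | gunzip -c | mongoimport -d mydb -c mycollection
        done

    mongoimport expects one JSON document per line on stdin by default; if each file instead holds a single JSON array, add the --jsonArray flag.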
