How can I import bulk data from a CSV file into DynamoDB?

我在风中等你 2021-01-31 15:08

I am trying to import a CSV file data into AWS DynamoDB.

Here's what my CSV file looks like:

first_name  last_name
sri ram
Rahul   Dravid
JetPay  Underw

14 answers
  •  误落风尘
    2021-01-31 16:07

    Here's a simpler solution, and with it you don't have to remove empty-string attributes first.

    require('./env'); //contains aws secret/access key
    const parse = require('csvtojson');
    const AWS = require('aws-sdk');
    
    // --- start user config ---
    
    const CSV_FILENAME = __dirname + "/002_subscribers_copy_from_db.csv";
    const DYNAMODB_TABLENAME = '002-Subscribers';
    
    // --- end user config ---
    
    // You could add your credentials here, or store them in
    // process.env as I have done; the aws-sdk will pick up the
    // keys from the environment.
    
    AWS.config.update({
        region: process.env.AWS_REGION
    });
    
    const db = new AWS.DynamoDB.DocumentClient({
        convertEmptyValues: true
    });
    
    (async ()=>{
        const json = await parse().fromFile(CSV_FILENAME);
    
        // This loop is efficient enough if you're processing small
        // amounts of data. If your data set is large, use the
        // DynamoDB .batchWrite() method instead and send the data
        // in chunks of 25 items (the per-request limit); a sketch
        // of that approach follows this snippet.
        for (var i = 0; i < json.length; i++) {
            // write each parsed CSV row as one DynamoDB item
            await db.put({
                TableName: DYNAMODB_TABLENAME,
                Item: json[i]
            }).promise();
        }
    })();
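
    For larger CSV files, a batchWrite() version might look roughly like the sketch below. It reuses the parse, db, CSV_FILENAME and DYNAMODB_TABLENAME defined above and simply splits the parsed rows into chunks of 25; retrying UnprocessedItems is only hinted at, not fully handled. Since it goes through the same DocumentClient, the convertEmptyValues: true setting still applies.

    (async () => {
        const json = await parse().fromFile(CSV_FILENAME);

        // send the rows in chunks of 25, the batchWrite per-request limit
        for (let i = 0; i < json.length; i += 25) {
            const chunk = json.slice(i, i + 25);

            const result = await db.batchWrite({
                RequestItems: {
                    [DYNAMODB_TABLENAME]: chunk.map(item => ({
                        PutRequest: { Item: item }
                    }))
                }
            }).promise();

            // batchWrite can return items it could not write; in a real
            // import you would retry these instead of just logging them
            const unprocessed = result.UnprocessedItems || {};
            if (Object.keys(unprocessed).length > 0) {
                console.warn('Unprocessed items:', unprocessed);
            }
        }
    })();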
