Google Cloud Datastore: Bulk Importing w Node.js


Question


I need to write a huge number of entities (1.5 million lines from a .csv file) to Google Cloud Datastore. This is kind of a two-part question:

Can I structure an entity like this (or is kind a required property?):

const item = {
    family: "chevrolet",
    series: "impala",
    data: {
        sku: "chev-impala",
        description: "Chevrolet Impala Sedan",
        price: "20000"
    }
}

Then, regarding importing, I'm unsure how this works. If I can't simply dump/upload/import a huge .json file, I want to use Node.js. I would like each entity to have an auto-generated universal ID. Is there an asynchronous means of writing? I have a Node script that pipes out a few hundred entities/records at a time and pauses while awaiting the writes to resolve, which is what I'm looking for: a promise-based import.
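(For reference, a minimal sketch of that batched, promise-based pattern using the official @google-cloud/datastore client. The file name, the kind name "Item", the CSV column order, and the batch size are all assumptions, not part of the original post:)

const fs = require('fs');
const readline = require('readline');
const { Datastore } = require('@google-cloud/datastore');

const datastore = new Datastore();
const BATCH_SIZE = 500; // a single Datastore commit accepts at most 500 entities

async function importCsv(path) {
  const rl = readline.createInterface({ input: fs.createReadStream(path) });
  let batch = [];
  for await (const line of rl) {
    // Assumed column order: family,series,sku,description,price
    const [family, series, sku, description, price] = line.split(',');
    batch.push({
      key: datastore.key('Item'), // incomplete key: Datastore assigns the ID
      data: { family, series, sku, description, price },
    });
    if (batch.length === BATCH_SIZE) {
      await datastore.save(batch); // pause until this batch's write resolves
      batch = [];
    }
  }
  if (batch.length) {
    await datastore.save(batch); // flush the final partial batch
  }
}

importCsv('items.csv').catch(console.error);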


Answer 1:


You can use Apache Beam to import data from a CSV file into Cloud Datastore. Take a look at the thread: Import CSV into google cloud datastore.

How to work with entities is explained in the documentation here.
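In that client library, the kind is not a property of the entity itself; it lives in the key, and saving an entity with an incomplete key produces the auto-generated ID the question asks about. A hedged sketch, assuming a kind named "Item":

const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

const entity = {
  key: datastore.key('Item'), // kind goes in the key; no ID, so Datastore allocates one
  data: {
    family: 'chevrolet',
    series: 'impala',
    sku: 'chev-impala',
    description: 'Chevrolet Impala Sedan',
    price: '20000',
  },
};

datastore.save(entity)
  .then(() => console.log('saved with auto-generated ID:', entity.key.id))
  .catch(console.error);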

Exporting and Importing Entities is a fully managed service, but it can only import entities that were previously exported with that same managed export and import service, so it won't accept a raw .csv file.
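For completeness, that managed service is driven from the gcloud CLI; a typical round trip looks roughly like this (the bucket name and export path are placeholders, and the exact metadata file name depends on your export):

gcloud datastore export gs://my-bucket/my-export
gcloud datastore import gs://my-bucket/my-export/my-export.overall_export_metadata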



Source: https://stackoverflow.com/questions/51025728/google-cloud-datastore-bulk-importing-w-node-js
