Question
I need to write a huge number of entities (1.5 million lines from a .csv file) to Google Cloud Datastore. It's sort of a two-part question:
Can I do this (or is kind a necessary property?):
const item = {
  family: "chevrolet",
  series: "impala",
  data: {
    sku: "chev-impala",
    description: "Chevrolet Impala Sedan",
    price: "20000"
  }
}
Then, regarding importing, I'm unsure of how this works. If I can't simply dump/upload/import a huge .json file, I want to use Node.js. I would like each entity to have an auto-generated ID. Is there an asynchronous means of writing? I have a Node script that pipes out a few hundred entities/records at a time and pauses while it awaits the writes resolving, which is what I'm looking for: a promise-based import.
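Roughly, the shape I have in mind looks like this. This is only a sketch: it assumes the @google-cloud/datastore client and the csv-parse package, and the file name "vehicles.csv" and kind name "Vehicle" are placeholders of mine.

// Sketch only: assumes the @google-cloud/datastore and csv-parse packages;
// "vehicles.csv" and the kind "Vehicle" are placeholder names.
const fs = require('fs');
const { parse } = require('csv-parse/sync'); // a streaming parser would be nicer for 1.5M rows
const { Datastore } = require('@google-cloud/datastore');

const datastore = new Datastore();
const BATCH_SIZE = 400; // stay under Datastore's 500-mutation limit per commit

async function importCsv() {
  const rows = parse(fs.readFileSync('vehicles.csv'), { columns: true });

  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const entities = rows.slice(i, i + BATCH_SIZE).map(row => ({
      key: datastore.key('Vehicle'), // incomplete key, so the ID is auto-generated
      data: {
        family: row.family,
        series: row.series,
        sku: row.sku,
        description: row.description,
        price: row.price
      }
    }));
    await datastore.save(entities); // promise resolves once the batch is committed
    console.log(`wrote ${i + entities.length} of ${rows.length}`);
  }
}

importCsv().catch(console.error);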
Answer 1:
You can use Apache Beam to import data from a CSV file into Cloud Datastore. Take a look at this thread: Import CSV into google cloud datastore.
How to work with entities is explained in the documentation here.
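As a minimal sketch with the @google-cloud/datastore Node.js client (the kind name "Vehicle" is just an example): the kind belongs to the entity's key rather than to its data, and leaving the key incomplete gives you an auto-generated ID.

const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

// Incomplete key: Datastore allocates a numeric ID on save.
const autoIdKey = datastore.key('Vehicle');
// Complete key: the caller supplies the name ("chev-impala" here).
const namedKey = datastore.key(['Vehicle', 'chev-impala']);

datastore.save({
  key: autoIdKey,
  data: { family: 'chevrolet', series: 'impala', price: '20000' }
}).then(() => console.log('entity saved'));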
Exporting and Importing Entities is a fully managed service, and you can only import entities that were previously exported with the managed export and import service.
Source: https://stackoverflow.com/questions/51025728/google-cloud-datastore-bulk-importing-w-node-js