
I need to write a huge quantity of entities (1.5 million lines from a .csv file) to Google Cloud Datastore. It's kind of a two-part question:

Can I store an entity shaped like this (or is kind a required property?):

const item = {
    family: "chevrolet",
    series: "impala",
    data: {
        sku: "chev-impala",
        description: "Chevrolet Impala Sedan",
        price: "20000"
    }
}

Then, regarding the import, I'm unsure how this works. If I can't simply dump/upload/import a huge .json file, I'd like to use Node.js. I would like each entity to have an autogenerated unique ID. Is there an asynchronous means of writing? I have a Node script that pipes out a few hundred entities/records at a time and pauses to await the write resolving ...which is what I'm looking for: a promise-based import.
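For illustration, here is a minimal sketch of that kind of batched, promise-based writer using the official @google-cloud/datastore client. The kind name Car, the BATCH_SIZE value, the toEntity helper, and the cars.csv path are assumptions for this example, and the parsing assumes simple CSV lines with no quoted commas:

const fs = require('fs');
const readline = require('readline');
const {Datastore} = require('@google-cloud/datastore');

const datastore = new Datastore();
const BATCH_SIZE = 400; // Datastore allows up to 500 entities per commit

// Build one entity per CSV line. An incomplete key (kind only, no id)
// tells Datastore to auto-generate a numeric ID on save.
function toEntity(line) {
  const [family, series, sku, description, price] = line.split(',');
  return {
    key: datastore.key('Car'), // 'Car' is a hypothetical kind name
    data: { family, series, data: { sku, description, price } },
  };
}

async function importCsv(path) {
  const rl = readline.createInterface({ input: fs.createReadStream(path) });
  let batch = [];
  for await (const line of rl) {
    batch.push(toEntity(line));
    if (batch.length === BATCH_SIZE) {
      await datastore.save(batch); // save() returns a promise
      batch = [];
    }
  }
  if (batch.length > 0) await datastore.save(batch); // flush the final partial batch
}

importCsv('cars.csv').catch(console.error);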

Ronnie Royston

1 Answer


You can use Apache Beam to import data from a CSV file into Cloud Datastore. Take a look at this thread: Import CSV into google cloud datastore.

How to work with entities is explained in the documentation here.
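To illustrate what the documentation describes: with the Node.js client (@google-cloud/datastore), the kind is not a property inside the object but part of the entity's key, and the nested object from the question is stored as an embedded entity. A sketch, with 'Car' as an assumed kind name:

const {Datastore} = require('@google-cloud/datastore');
const datastore = new Datastore();

const entity = {
  key: datastore.key('Car'), // incomplete key: Datastore assigns the ID on save
  data: {
    family: 'chevrolet',
    series: 'impala',
    data: { sku: 'chev-impala', description: 'Chevrolet Impala Sedan', price: '20000' },
  },
};

// save() is promise-based; after it resolves, entity.key.id holds the generated ID
datastore.save(entity).then(() => console.log(entity.key.id));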

Exporting and Importing Entities is a fully managed service, but it can only import entities that were previously exported with the managed export and import service, so it won't help with a raw CSV or JSON file.

komarkovich