
I have a JSON file that I want to use to load my DynamoDB table in AWS. In the AWS console, there is only an option to create one record at a time. Not good. :)

Essentially, my .json file is an array of objects that hold the data for each column in the table, i.e.:

[
    {
        "Column1": "Column1 Value",
        "Column2": "Column2 Value",
        "Column3": "Column3 Value",
        "Column4": "Column4 Value"
    },
    ...
]

Is there any way to do this via the AWS console by importing my JSON file, or do I have to use the AWS JS SDK to do this programmatically?

29er

4 Answers


The answer from E.J. Brennan looks correct for a single record, but it doesn't answer the original question (which needs to add an array of records).

For this, the command is

aws dynamodb batch-write-item --request-items file://aws-requests.json

But you'll need to make a modified JSON file, like so (note the DynamoDB JSON format, which specifies data types):

{
    "YourTableName": [
        {   
            "PutRequest": {
                "Item": { 
                    "Column1": { "S": "Column1 Value" },
                    "Column2": { "S": "Column2 Value" },
                    "Column3": { "S": "Column3 Value" },
                    "Column4": { "S": "Column4 Value" },
                }
            }
        },
        {
            "PutRequest": {
                "Item": { 
                    "Column1": { "S": "Column1 Value" },
                    "Column2": { "S": "Column2 Value" },
                    "Column3": { "S": "Column3 Value" },
                    "Column4": { "S": "Column4 Value" },
                }
            }
        }
    ]
}
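
If you'd rather not build that file by hand, here is a minimal Python sketch that converts the flat array of objects from the question into this request-items format (the input name mydata.json and the output names aws-requests-N.json are just placeholders). It also splits the output into files of 25 items each, since BatchWriteItem accepts at most 25 put requests per call:

import json

TABLE_NAME = "YourTableName"   # the table name used in the example above
BATCH_SIZE = 25                # BatchWriteItem accepts at most 25 requests per call

# load the flat array of objects, e.g. [{"Column1": "Column1 Value", ...}, ...]
with open("mydata.json") as f:
    rows = json.load(f)

# wrap every attribute in DynamoDB JSON ("S" marks a string value)
put_requests = [
    {"PutRequest": {"Item": {key: {"S": str(value)} for key, value in row.items()}}}
    for row in rows
]

# write one request file per batch of 25 items
for i in range(0, len(put_requests), BATCH_SIZE):
    batch = {TABLE_NAME: put_requests[i:i + BATCH_SIZE]}
    with open(f"aws-requests-{i // BATCH_SIZE}.json", "w") as out:
        json.dump(batch, out, indent=4)

Each generated file can then be passed to aws dynamodb batch-write-item --request-items file://aws-requests-0.json, and so on.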
carpiediem
    Hi there, I see this solution allows up to 25 writes per batch-write-item request. This is probably important to let people know (I just tried it for 1600) :) – Ash Oldershaw Dec 22 '20 at 12:25

You don't need to use the API. You could use the AWS CLI instead, e.g.:

aws dynamodb put-item --table-name MusicCollection --item file://item.json --return-consumed-capacity TOTAL

but you may need to tweak your JSON format a bit.
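
For example, a single record from the question would need an item.json in DynamoDB JSON, roughly like this (a sketch; the attribute names, and the assumption that every value is a string, come from the question):

{
    "Column1": { "S": "Column1 Value" },
    "Column2": { "S": "Column2 Value" },
    "Column3": { "S": "Column3 Value" },
    "Column4": { "S": "Column4 Value" }
}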

More examples and documentation here:

https://docs.aws.amazon.com/cli/latest/reference/dynamodb/put-item.html

E.J. Brennan
  • Thank you! Yep, I had to tweak the JSON, but it worked. – 29er Apr 28 '18 at 18:32
  • @29er Can you tell us how you tweaked the JSON? From the docs I understand that the `item.json` file contains only one item. Did you get to load an **array of items** from one single `item.json` file? This is what the question is about, isn't it? – Costin Apr 28 '18 at 20:46
  • hi, yeah, after I modified the JSON, I used `aws dynamodb batch-write-item --request-items file://TableItems.json` – 29er Apr 30 '18 at 04:07
  • just to clarify, we followed the instructions here: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/WorkingWithItems.html – 29er May 08 '18 at 19:11
  • But what does your tweaked JSON look like, @29er? – theineffablebob Apr 25 '19 at 20:08
  • @29er So what you're saying is that the answer to use "put-item" was incorrect? – NealeU Jun 12 '19 at 11:12

I used boto3 in Python to load the data:

import boto3
import json

dynamodbclient = boto3.resource('dynamodb')
sample_table = dynamodbclient.Table('ec2metadata')

with open('/samplepath/spotec2interruptionevent.json', 'r') as myfile:
    data = myfile.read()

# parse file
obj = json.loads(data)

# instance_id and cluster_id are the key attributes in the DynamoDB table
instance_id = obj['instance_id']
cluster_id = obj['cluster_id']

response = sample_table.put_item(
    Item={
        'instance_id': instance_id,
        'cluster_id': cluster_id,
        'event': obj
    }
)

Here is a sample for JavaScript:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GettingStarted.Js.02.html#GettingStarted.Js.02.02

Anandkumar

The code above doesn't read in individual JSON objects. If you want to do that from a JSON file with multiple objects:

import boto3
import json

dynamodbclient=boto3.resource('dynamodb')
sample_table = dynamodbclient.Table('ec2metadata')

with open('/samplepath/spotec2interruptionevent.json', 'r') as myfile:
    data=myfile.read()

# parse file
objects = json.loads(data)

# instance_id and cluster_id are the key attributes in the DynamoDB table
for item in objects:
    instance_id = item["instance_id"]
    cluster_id = item["cluster_id"]
    sample_table.put_item(Item=item)
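
As a side note, boto3 can also batch these writes for you. A minimal sketch under the same assumptions (same table name and file path as the code above):

import boto3
import json

dynamodbclient = boto3.resource('dynamodb')
sample_table = dynamodbclient.Table('ec2metadata')

with open('/samplepath/spotec2interruptionevent.json', 'r') as myfile:
    objects = json.load(myfile)

# batch_writer buffers the puts into BatchWriteItem calls, handling the
# 25-item limit and resending unprocessed items automatically
with sample_table.batch_writer() as batch:
    for item in objects:
        batch.put_item(Item=item)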
T.UK