
I need to bulk import, say, 100 records into Cosmos DB.

I found dt.exe, but it doesn't help: it throws an error when importing a CSV into Cosmos DB with the Table API.

I'm not able to find any reliable way to automate this process.

Sathish Naga

1 Answer


The command-line Azure Cosmos DB Data Migration tool (dt.exe) can be used to import your existing Azure Table storage data to a Table API GA account, or migrate data from a Table API (preview) account into a Table API GA account. Other sources are not currently supported. The UI based Data Migration tool (dtui.exe) is not currently supported for Table API accounts.

According to the above official statement, other sources (e.g. a CSV file) cannot be migrated into an Azure Table API account directly. You could adopt a workaround: read the CSV file in a program, then import the data through the Table SDK.

Please refer to the sample Python code that I wrote in this thread:

from azure.cosmosdb.table.tableservice import TableService
from azure.cosmosdb.table.models import Entity, EntityProperty, EdmType
from datetime import datetime
import csv
import sys
import codecs

table_service = TableService(connection_string='***')

# Python 2 only -- see the note at the end for Python 3.
reload(sys)
sys.setdefaultencoding('utf-8')
filename = "E:/jay.csv"

with codecs.open(filename, 'rb', encoding="utf-8") as f_input:
    csv_reader = csv.reader(f_input)
    for row in csv_reader:
        # Map each CSV column onto a table entity property.
        task = Entity()
        task.PartitionKey = row[0]
        task.RowKey = row[1]
        task.description = row[2]
        task.priority = EntityProperty(EdmType.INT32, int(row[3]))
        task.logtime = EntityProperty(
            EdmType.DATETIME,
            # Assumption: timestamps look like 2017-10-04 15:30:00;
            # adjust the format string to match your CSV.
            datetime.strptime(row[4], '%Y-%m-%d %H:%M:%S'))
        table_service.insert_entity('tasktable', task)
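
Since you mention bulk importing about 100 records: instead of calling insert_entity once per row, the SDK also lets you group inserts into a batch and commit them in one round trip. A minimal sketch, assuming all entities share one PartitionKey (a hard requirement for batches, which also hold at most 100 operations) and the hypothetical table name 'tasktable':

from azure.cosmosdb.table.tableservice import TableService
from azure.cosmosdb.table.tablebatch import TableBatch

table_service = TableService(connection_string='***')

batch = TableBatch()
for i in range(100):
    # Entities in one batch must all use the same PartitionKey.
    entity = {'PartitionKey': 'bulk',
              'RowKey': str(i),
              'description': 'item %d' % i}
    batch.insert_entity(entity)

# Commit all 100 inserts in a single request.
table_service.commit_batch('tasktable', batch)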

Alternatively, you could submit feedback to the Azure Cosmos DB team here.

Hope it helps you.


Just a minor update:

If you use Python 3, there is no need for reload(sys) or sys.setdefaultencoding('utf-8'); just open the file in text mode ('r') and write the path as a raw string: filename = r"E:/jay.csv".
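
For example, a minimal Python 3 sketch of the same loop (same assumed table name 'tasktable' and column layout as above):

import csv
from azure.cosmosdb.table.tableservice import TableService
from azure.cosmosdb.table.models import Entity, EntityProperty, EdmType

table_service = TableService(connection_string='***')
filename = r"E:/jay.csv"

# Python 3: open in text mode with an explicit encoding; no codecs needed.
with open(filename, 'r', encoding='utf-8', newline='') as f_input:
    for row in csv.reader(f_input):
        task = Entity()
        task.PartitionKey = row[0]
        task.RowKey = row[1]
        task.description = row[2]
        task.priority = EntityProperty(EdmType.INT32, int(row[3]))
        table_service.insert_entity('tasktable', task)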

Jay Gong