
I have instantiated a Db2 Warehouse on IBM Cloud and created a table in that Db2 instance. Now I want to load that table with data that resides in a local *.csv file.

I am able to create a bearer token:

curl -X POST "https://dashdb-entry-yp-lon02-01.services.eu-gb.bluemix.net/dashdb-api/v2/auth" -H  "accept: application/json" -H  "content-type: application/json" -d "{  \"userid\": \"dash14096\",  \"password\": \"************\"}"

Also, I can fetch the table meta information:

curl -X GET "https://dashdb-entry-yp-lon02-01.services.eu-gb.bluemix.net/dashdb-api/v2/schemas/DASH14096/tables/MYTABLE" -H  "accept: application/json" -H  "Authorization: Bearer eyJ0eXAiOiJKV1Q..."

I have a local file "mydata.csv" with the following content:
1;ONE
2;TWO
3;THREE

I am actually able to load that data using the V1 REST interface:

curl --user "dash14096:************" -H "Content-Type: multipart/form-data" -X POST -F loadFile1=@".\mydata.csv" "https://dashdb-entry-yp-lon02-01.services.eu-gb.bluemix.net/dashdb-api/load/local/del/DASH14096.MYTABLE?hasHeaderRow=false&delimiter=;&loadAction=REPLACE"

BUT: How can I load the data into my Db2 Warehouse table using the V2 or V3 REST interface? In particular, if I need to use the /load_jobs function, how do I build a multipart REST request when the body must also hold the load options?

Any hint is appreciated. Thanks!!

josoft

1 Answer


The process of loading data from a local CSV file into Db2 on Cloud using the V3 API is fairly involved and requires the following steps:

  1. generate the auth token - POST /auth/tokens
  2. upload the local file to the Db2 instance - POST /home_content/{path}
  3. create a load job using the file uploaded in the previous step - POST /load_jobs
  4. repeatedly check the load job for completion using the id from step 3 - GET /load_jobs/{id}
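A rough sketch of steps 1-3 with curl might look like the following. Note that /load_jobs takes a plain JSON body rather than a multipart request - the file itself is already on the server after step 2. The JSON field names below (`load_source`, `server_source_file_path`, `file_options`, etc.) are assumptions written from memory of the API docs; check them against the current /load_jobs schema before relying on them:

```shell
#!/bin/sh
# Base URL of the Db2 on Cloud REST API (V3) for this instance.
HOST='https://dashdb-entry-yp-lon02-01.services.eu-gb.bluemix.net/dbapi/v3'

# 1. Generate an auth token (crudely extract "token" from the JSON response;
#    use jq in real scripts if it is available).
TOKEN=$(curl -s -X POST "$HOST/auth/tokens" \
  -H 'content-type: application/json' \
  -d '{"userid": "dash14096", "password": "************"}' \
  | sed -n 's/.*"token"[: ]*"\([^"]*\)".*/\1/p')

# 2. Upload the local CSV to the home directory of the instance.
curl -s -X POST "$HOST/home_content/mydata.csv" \
  -H "Authorization: Bearer $TOKEN" \
  -F file=@mydata.csv

# 3. Create a load job referencing the uploaded file. Field names here are
#    illustrative placeholders - verify against the API reference.
curl -s -X POST "$HOST/load_jobs" \
  -H "Authorization: Bearer $TOKEN" \
  -H 'content-type: application/json' \
  -d '{
        "load_source": "SERVER",
        "schema": "DASH14096",
        "table": "MYTABLE",
        "load_action": "REPLACE",
        "server_source_file_path": "mydata.csv",
        "file_options": {
          "has_header_row": "no",
          "column_delimiter": ";"
        }
      }'
```

The response from step 3 should contain a job id, which step 4 then polls.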

You can alternatively use the db2-rest-client npm module:

export DB_USERID='dash14096'
export DB_PASSWORD='************'
export DB_URI='https://dashdb-entry-yp-lon02-01.services.eu-gb.bluemix.net/dbapi/v3'
db2-rest-client load --file=mydata.csv --table='MYTABLE' --schema='MYSCHEMA' --type=INSERT

If you want to inspect the request payloads and build your own curl commands in a script, you can clone the module's git repository and debug the load-job calls in its integration tests. In that case you'll need to build your own loop to check for load job completion.
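A minimal polling loop for step 4 could look like this; it assumes $HOST, $TOKEN and the job id ($JOB_ID) from the earlier calls, and the `status` field name and terminal state names are assumptions to verify against the API reference:

```shell
#!/bin/sh
# 4. Poll the load job until it reaches a terminal state.
#    $HOST, $TOKEN and $JOB_ID come from the previous steps.
while true; do
  STATUS=$(curl -s "$HOST/load_jobs/$JOB_ID" \
    -H "Authorization: Bearer $TOKEN" \
    | sed -n 's/.*"status"[: ]*"\([^"]*\)".*/\1/p')
  echo "load job status: $STATUS"
  case "$STATUS" in
    Success|Failure) break ;;   # terminal state names are assumed
  esac
  sleep 5
done
```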

adtanasa