I need to re-create a database from one DashDB Bluemix service into another, and I need to automate this procedure in bash scripts. The best approach I can think of is a DashDB REST API that lets me export the content of the entire database into JSON (or any other format you can think of), together with a corresponding API that lets me re-import the content into a different database on the same service or on a different service, possibly in a different Bluemix space. Thanks.
-
Why not use the DB2 `export` and `import` commands? – mustaccio Oct 12 '16 at 17:51
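(For reference, a minimal sketch of that approach from a machine with the Db2 CLP installed and both databases catalogued; the aliases, schema, and table names below are placeholders:)
# Export a table from the source database to a delimited (CSV) file
db2 connect to SRCDB user <USERID> using <PASSWORD>
db2 "EXPORT TO mytable.csv OF DEL SELECT * FROM MYSCHEMA.MYTABLE"
# Import the file into the target database
db2 connect to TGTDB user <USERID> using <PASSWORD>
db2 "IMPORT FROM mytable.csv OF DEL INSERT INTO MYSCHEMA.MYTABLE"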
-
See also http://stackoverflow.com/questions/39740976/dashdb-and-db2-load-operation – mustaccio Oct 12 '16 at 17:53
-
That's an alternative we are considering; however, we would prefer not to add too many prerequisites to our automation. – michele crudele Oct 13 '16 at 07:39
2 Answers
I assume you want to do a one-time move and this is not about continuous replication. In that case, simply sign up on http://datascience.ibm.com, navigate to DataWorks, select "Load Data" from the navigation panel (open it by clicking the top left), and then select Cloud Database as the source type.
[Screenshot: DataWorks loading data from dashDB to dashDB]
If, however, you would still prefer to write your own app or script that does the data movement, and you want a REST API to export JSON data, then I recommend writing a simple R script that reads the data from a table (using ibmdbR) and writes it to stdout, deploying the script into dashDB (POST /home), and running the R script from your app/script by calling the /rscript endpoint: https://developer.ibm.com/clouddataservices/wp-content/themes/projectnext-clouddata/dashDB/#/
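If you go that route, the deploy-and-run calls might look roughly like the following; the base URL and exact endpoint paths here are assumptions, so verify them against the API reference linked above.
# Hypothetical sketch: upload an R script that prints table contents as
# JSON to stdout, then run it; base URL and paths are assumptions.
API='https://<HOSTNAME>:8443/dashdb-api'
# Deploy export.R into the dashDB home directory (POST /home)
curl -k -u '<USERID>:<PASSWORD>' -X POST "$API/home" -F 'file=@export.R'
# Run the deployed script; whatever it writes to stdout comes back in the response
curl -k -u '<USERID>:<PASSWORD>' -X POST "$API/rscript/export.R" > data.json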

For Db2 on Cloud and Db2 Warehouse on Cloud, there is a REST API available that allows you to export data from a table in CSV format (up to 100,000 rows) and then load the data back. It requires a few requests, such as the following (a rough curl sketch of the flow follows the list):
- POST /auth/tokens
- GET /sql_query_export
- POST /home_content/{path}
- POST /load_jobs
- GET /load_jobs/{id}
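Here is a rough sketch of that flow with curl and jq; the payload fields and response shapes are illustrative assumptions, so check the Db2 on Cloud API reference for the exact schemas.
# 1. Authenticate against each service and capture a bearer token
SRC='https://<SOURCE_HOSTNAME>/dbapi/v3'
TGT='https://<TARGET_HOSTNAME>/dbapi/v3'
SRC_TOKEN=$(curl -s -X POST "$SRC/auth/tokens" \
  -H 'Content-Type: application/json' \
  -d '{"userid": "<USERID>", "password": "<PASSWORD>"}' | jq -r '.token')
TGT_TOKEN=$(curl -s -X POST "$TGT/auth/tokens" \
  -H 'Content-Type: application/json' \
  -d '{"userid": "<USERID>", "password": "<PASSWORD>"}' | jq -r '.token')
# 2. Export a query result from the source as CSV (100,000 row limit)
curl -s -G "$SRC/sql_query_export" \
  -H "Authorization: Bearer $SRC_TOKEN" \
  --data-urlencode 'query=SELECT * FROM SRC_SCHEMA.SRC_TABLE' > data.csv
# 3. Upload the file to the target's home directory
curl -s -X POST "$TGT/home_content/data.csv" \
  -H "Authorization: Bearer $TGT_TOKEN" \
  -F 'file=@data.csv'
# 4. Start a load job on the target and poll it until it finishes
JOB_ID=$(curl -s -X POST "$TGT/load_jobs" \
  -H "Authorization: Bearer $TGT_TOKEN" \
  -H 'Content-Type: application/json' \
  -d '{"load_source": "SERVER", "schema": "DEST_SCHEMA", "table": "DEST_TABLE", "file_options": {"path": "data.csv"}}' | jq -r '.id')
curl -s "$TGT/load_jobs/$JOB_ID" -H "Authorization: Bearer $TGT_TOKEN"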
I've implemented a client npm module for this API, db2-rest-client, with which you can export a statement result to a JSON file:
export DB_USERID='<USERID>';export DB_PASSWORD='<PASSWORD>';export DB_URI='https://<SOURCE_HOSTNAME>/dbapi/v3'
db2-rest-client query --query="SELECT * FROM SRC_SCHEMA.SRC_TABLE" > test.json
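The shape of the exported JSON may vary by module version, but assuming it is an array of flat row objects, a jq one-liner can do the JSON-to-CSV conversion:
# Emit a header row from the first object's keys, then one CSV line per row
jq -r '(.[0] | keys_unsorted), (.[] | [.[]]) | @csv' test.json > transformed.csv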
With the data transformed into a .csv file, you can then run the load job against the target:
export DB_USERID='<USERID>';export DB_PASSWORD='<PASSWORD>';export DB_URI='https://<TARGET_HOSTNAME>/dbapi/v3'
db2-rest-client load --file=transformed.csv --table='DEST_TABLE' --schema='DEST_SCHEMA'
