
How do you insert large quantities of JSON data from a REST API that is not Cloudant into dashDB?

Bjoern
  • In the case of Cloudant to dashDB, the JSON data is transformed to relational format. Does that apply to your question? How is the intended format for dashDB? – data_henrik Aug 17 '15 at 06:13
  • While not a REST API, the `db2nosql` command line tool can import JSON documents from a file, to DB2 or dashDB: http://www.ibm.com/developerworks/data/library/techarticle/dm-1306nosqlforjson2/ Also, there's info on how to use Java/drivers to insert JSON here: http://www.ibm.com/developerworks/views/data/libraryview.jsp?search_by=DB2+JSON+capabilities – SilentSteel Jun 23 '16 at 20:51

1 Answer


Agreed with Aislinn: Simple Data Pipes is designed to move large quantities of JSON data via REST APIs (see the project on GitHub). At the moment, Version 2 of Pipes is shipping with Salesforce and Stripe billing connectors that let you move the data to dashDB.

It also adds a connector API that allows you to easily write your own custom connector.

You can check out a sample connector here.

This sample connector shows you how to move some hard-coded JSON records; you can use it as a starting point for building your own.

A complete tutorial is in the works and should be released some time next week.

David Taieb
  • It's probably worth stating that you shouldn't do a row-by-row load into dashDB as you will probably not get very good compression. Instead batch up the data and use an api like https://developer.ibm.com/clouddataservices/docs/dashdb/rest-api/ to bulk load the data. – Chris Snow Jun 23 '16 at 16:03
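The batching approach described in the comment above can be sketched as follows. This is a minimal illustration, not the dashDB API itself: `sqlite3` stands in for dashDB so the example is self-contained, and the table name, column mapping, and batch size are all assumptions; with dashDB you would instead point the same batching logic at a DB2 driver or the dashDB bulk-load REST API.

```python
import sqlite3
from itertools import islice

# Hypothetical JSON records as they might arrive from a REST API.
records = [{"id": i, "name": f"item-{i}", "price": i * 1.5} for i in range(10)]

# sqlite3 is a stand-in for dashDB here; the schema is assumed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER, name TEXT, price REAL)")

BATCH_SIZE = 4  # assumed; tune for your data volume

def batches(iterable, size):
    """Yield successive fixed-size batches from an iterable."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

for batch in batches(records, BATCH_SIZE):
    # One multi-row insert per batch instead of one round trip per row.
    conn.executemany(
        "INSERT INTO items (id, name, price) VALUES (?, ?, ?)",
        [(r["id"], r["name"], r["price"]) for r in batch],
    )
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # 10
```

The point of the batching is the one the comment makes: grouping rows into larger inserts avoids the per-row overhead (and, in dashDB's case, the poor compression) of loading one record at a time.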