
I have a use case where I need to continuously trickle-feed data into dashDB; however, I have been informed that this is not optimal for dashDB.

Why is this not optimal? Is there a workaround?

Chris Snow

2 Answers


Columnar warehouses are great for reads, but if you insert a single row into an N-column table, the system has to cut the row into pieces and perform N separate writes to disk. This makes small inserts relatively inefficient, and load throughput can degrade as a result.

You may want to do an initial batch load of data. Currently the compression dictionary is built only during bulk loads, so if you start with a new table and populate it only with inserts, the data doesn't get compressed at all.

Try to structure the loading into micro-batches with a 2-5 minute load cycle, as in the sketch below.
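
Here is a minimal micro-batching sketch in Python. It assumes the ibm_db_dbi driver (the DB-API wrapper shipped with the ibm_db package); the connection string, the SENSOR_READINGS table and its columns, the incoming_rows() feed, and the flush thresholds are all placeholders for illustration, not part of the original answer.

    import time
    import ibm_db_dbi

    BATCH_INTERVAL_SECS = 180    # flush roughly every 3 minutes (2-5 minute cycle)
    MAX_BUFFERED_ROWS = 10000    # or earlier, once the buffer gets large

    # Placeholder connection string; fill in your dashDB host and credentials.
    conn = ibm_db_dbi.connect(
        "DATABASE=BLUDB;HOSTNAME=<host>;PORT=50000;PROTOCOL=TCPIP;UID=<user>;PWD=<pwd>;"
    )

    def flush(rows):
        """Write all buffered rows in one multi-row statement so dashDB sees
        a few large writes instead of a continuous trickle of single-row inserts."""
        if not rows:
            return
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO SENSOR_READINGS (DEVICE_ID, TS, VALUE) VALUES (?, ?, ?)",
            rows,
        )
        conn.commit()
        cur.close()

    buffer = []
    last_flush = time.time()
    for row in incoming_rows():   # incoming_rows() stands in for your trickle feed
        buffer.append(row)
        if len(buffer) >= MAX_BUFFERED_ROWS or time.time() - last_flush >= BATCH_INTERVAL_SECS:
            flush(buffer)
            buffer = []
            last_flush = time.time()
    flush(buffer)                 # write out whatever is left at the end

The point of the buffer is that each flush becomes a single prepared multi-row INSERT (or, better still, a bulk LOAD), so the per-row write amplification described above is paid once per batch rather than once per row.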

Chris Snow

What is the use case here? Check whether dashDB Transactional can meet your need. dashDB Transactional is tuned for OLTP and point-of-sale transactions, which is the kind of workload you are trying to feed.

Kiran