We have a requirement to read 100 million records, process them, and insert them into a DB2 table as part of our application development.
We have indexes on the primary keys, but we are not sure whether the insert process will take a long time, i.e. on the order of hours.
We hash on fields of the target table to distribute the data across different nodes, so I don't think partitioning the table or hashing the field data would help with insertion.
I would like to know what options are available for inserting 100 million records into a DB2 table efficiently using Java. We are using Spring Batch (batches of 9,000 records at a time).
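For reference, here is a minimal sketch of the JDBC batching pattern our chunk-oriented step follows. The table and column names (`TARGET_TABLE`, `ID`, `PAYLOAD`) are placeholders, not our real schema; the `main` method just computes how many 9,000-record batches 100 million rows works out to:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertSketch {

    // Placeholder SQL for illustration; the real statement matches our schema.
    private static final String INSERT_SQL =
            "INSERT INTO TARGET_TABLE (ID, PAYLOAD) VALUES (?, ?)";

    /** JDBC batching: one PreparedStatement, addBatch per row, one commit per chunk. */
    static void insertChunk(Connection conn, long[] ids, String[] payloads)
            throws SQLException {
        conn.setAutoCommit(false); // commit once per chunk, not per row
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            for (int i = 0; i < ids.length; i++) {
                ps.setLong(1, ids[i]);
                ps.setString(2, payloads[i]);
                ps.addBatch();
            }
            ps.executeBatch();
            conn.commit();
        }
    }

    /** Ceiling division: number of chunks needed to cover totalRecords. */
    static long computeBatches(long totalRecords, int chunkSize) {
        return (totalRecords + chunkSize - 1) / chunkSize;
    }

    public static void main(String[] args) {
        long batches = computeBatches(100_000_000L, 9_000);
        System.out.println("Chunks of 9,000 needed: " + batches);
    }
}
```

At 9,000 records per chunk, that is roughly 11,112 round trips of `executeBatch` plus a commit each, which is why we are worried about total elapsed time.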
Thanks in advance.