
I have a use case where we receive requests, each with around 1 million line items. Because the data is very important, we persist those 1 million entries (per request) to an Oracle DB in chunks of 100. Can you please suggest best practices for loading those 1 million entries into the DB, ideally all in one go? I remember Teradata has the FastLoad utility for loading that many rows. Is there anything similar in Oracle? In my use case, we pre-fetch the PKs from a sequence generator. Thanks in advance.
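For context, here is a minimal sketch (not our actual code) of the kind of batched JDBC insert I have in mind; the table `LINE_ITEMS`, the sequence `LINE_ITEM_SEQ`, and the column names are made up for illustration. The sequence fires inside the INSERT itself, so no PKs have to be pre-fetched, and the batch size is raised well above 100 so each network round trip carries more rows:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

public class LineItemLoader {

    // Much larger than 100; would need tuning against undo/redo and memory.
    private static final int BATCH_SIZE = 1_000;

    public static void insertAll(Connection conn, long requestId, List<String> payloads) throws Exception {
        // Let the sequence generate the PK inside the INSERT, so there is
        // no separate round trip to pre-fetch keys.
        String sql = "INSERT INTO line_items (id, request_id, payload) "
                   + "VALUES (line_item_seq.NEXTVAL, ?, ?)";

        conn.setAutoCommit(false);                 // one commit per request, not per chunk
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (String payload : payloads) {
                ps.setLong(1, requestId);
                ps.setString(2, payload);
                ps.addBatch();
                if (++pending == BATCH_SIZE) {     // send the whole batch in one round trip
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch();                 // flush the remaining rows
            }
            conn.commit();
        } catch (Exception e) {
            conn.rollback();
            throw e;
        }
    }
}
```

My understanding is that the batch size and committing once per request (rather than per chunk) are the main knobs here, but I would welcome corrections or a better approach.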

    You need to provide a bit more information about how exactly you are inserting the data and where the data is coming from. Define `a lot of time`: how much time do you expect it to take? – Nick Krasnov May 15 '17 at 16:05
    A well tuned SQL*Loader task could do a million rows pretty fast. But as @NicholasKrasnov says, we need a lot more details to give specific help. The good news is that the hints you have dropped ("chunks of size 100", "we pre-fetch PKs") suggest that improvements are almost certainly possible. – APC May 15 '17 at 16:11

0 Answers