I have to insert more than 2,000 records at a time, with BLOB fields, into an InterBase database. There is a performance problem if we execute a separate query for every insert.

I found out that batch insert is the solution. But I first have to prepare all the queries in order to run the insert scripts. My question is: how do I prepare the query scripts when BLOB fields are involved?

Please also suggest any other solution that would improve the performance.
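A minimal sketch of the prepare-once, commit-once pattern, assuming FireDAC (TFDConnection/TFDQuery) against InterBase/Firebird; the DOCS table, its PAYLOAD column, and the InsertDocs procedure are hypothetical names used only for illustration:

```pascal
// uses System.Classes, System.SysUtils, Data.DB, FireDAC.Comp.Client
// Sketch only. Hypothetical table: DOCS (ID INTEGER, PAYLOAD BLOB).
procedure InsertDocs(Conn: TFDConnection; const Files: TArray<string>);
var
  Qry: TFDQuery;
  Stm: TFileStream;
  I: Integer;
begin
  Qry := TFDQuery.Create(nil);
  try
    Qry.Connection := Conn;
    Qry.SQL.Text := 'insert into DOCS (ID, PAYLOAD) values (:ID, :PAYLOAD)';
    Qry.Prepare;                 // parse and prepare once, not per row
    Conn.StartTransaction;       // one transaction for the whole batch
    try
      for I := 0 to High(Files) do
      begin
        Qry.ParamByName('ID').AsInteger := I;
        Stm := TFileStream.Create(Files[I], fmOpenRead or fmShareDenyWrite);
        try
          // stream the blob value straight into the parameter
          Qry.ParamByName('PAYLOAD').LoadFromStream(Stm, ftBlob);
        finally
          Stm.Free;
        end;
        Qry.ExecSQL;
      end;
      Conn.Commit;               // commit once at the end
    except
      Conn.Rollback;
      raise;
    end;
  finally
    Qry.Free;
  end;
end;
```

Committing once for all rows instead of once per statement usually removes most of the overhead; from there, tuning how many rows go into each transaction is straightforward.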

shyambabu
  • Explain what you mean by "performance issue". 100 records is a trivially small volume that should not usually take a long time, unless you've done something wrong. However, if your blob values are "large", then your performance hit is a function of the amount of blob data you're trying to insert, and batch insert is unlikely to help. – Disillusioned Dec 30 '16 at 07:14
  • Sorry, it's my mistake. The number of queries is more than 2,000. I will edit the question. – shyambabu Dec 30 '16 at 07:45
  • You still need to clarify your performance requirements. What is the current performance, and what is your target? – Disillusioned Dec 30 '16 at 07:46
  • It's taking one and a half minutes for the insertion, and I am trying to get it down to 2 to 3 seconds. – shyambabu Dec 30 '16 at 07:52
  • How do you manage transactions? In Firebird... well... you would be able to insert small BLOBs as HEX strings into a script, but even that would be... well... exotic and hardly normal. The engine just works with blobs a bit differently than with regular values. So I think you should try other approaches, like tuning your queries-per-transaction ratio and tuning the server/database configuration (see the sketch after these comments). – Arioch 'The Dec 30 '16 at 08:25
  • 3
    anyway, this is programmers site, so I think you have to show the code - both the SQL table schema (full schema - including indices, triggers, constraints, etc, like one produced by IBExpert). And the Delphi code doing that insert. – Arioch 'The Dec 30 '16 at 08:26
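For the batch insert the question asks about, and the queries-per-transaction ratio mentioned in the comments, one option on the Delphi side is FireDAC's Array DML, which binds many rows of parameters and sends them in a single Execute call. This is a sketch only: the DOCS table is the same hypothetical one as above, and the AsBlobs indexed accessor (and whether Array DML yields a true server-side batch against InterBase) should be verified against your FireDAC version.

```pascal
// uses System.SysUtils, FireDAC.Comp.Client
// Sketch of FireDAC Array DML for the hypothetical DOCS table.
procedure BatchInsertDocs(Conn: TFDConnection;
  const Blobs: TArray<RawByteString>);
var
  Qry: TFDQuery;
  I: Integer;
begin
  Qry := TFDQuery.Create(nil);
  try
    Qry.Connection := Conn;
    Qry.SQL.Text := 'insert into DOCS (ID, PAYLOAD) values (:ID, :PAYLOAD)';
    Qry.Params.ArraySize := Length(Blobs);  // reserve one slot per row
    for I := 0 to High(Blobs) do
    begin
      Qry.Params[0].AsIntegers[I] := I;
      // assumption: AsBlobs is the indexed blob accessor in your version;
      // check the FireDAC Array DML documentation for the exact property
      Qry.Params[1].AsBlobs[I] := Blobs[I];
    end;
    Conn.StartTransaction;
    try
      Qry.Execute(Qry.Params.ArraySize);    // one call sends all rows
      Conn.Commit;
    except
      Conn.Rollback;
      raise;
    end;
  finally
    Qry.Free;
  end;
end;
```

Even if the driver falls back to per-row execution internally, keeping the statement prepared once and the transaction count at one usually accounts for most of the savings.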

0 Answers