
We are using Interbase XE3 (2013) with Delphi XE.

We need to export tables as CSV files so they can be imported into MS Access for further controlling purposes.

For exporting we use IBExpert's built-in function and/or our own program (export as CSV file via TClientDataSet from a cxGrid).

The problem: unfortunately, the export is way too limited, and we have no solution for exporting the whole table (in this case 400k rows and approx. 80 columns, increasing every month).

I've read about FBExport, but haven't tested it yet, as I do not want to risk problems with our database. Has anyone tested this tool?

Suitable solution (to be found): we need a way to export whole tables from InterBase XE3 into a CSV file without limitations on size/columns/rows (that's why I think CSV is the way to go, as there is no overhead). I would also prefer a solution that can be executed via batch file, without the need for a person in front of a computer.

Kind regards

Matze
  • Please be more specific, and when editing, introduce paragraphs by adding a blank line to make your question more readable. – AlBlue Aug 02 '16 at 12:55
  • 1
    Why not use ODBC or OLE DB providers and have Access connect directly to Firebird/InterBase? – Arioch 'The Aug 02 '16 at 13:57
  • Just put your database into read-only state and `FBExport` or `FBCopy` or `FlameRobin` or any other tool would not be able to alter the data in it. – Arioch 'The Aug 02 '16 at 13:58
  • But since you already use Delphi, you can just make your own custom-tailored exporter. The reading part can be https://github.com/hgourvest/uib/tree/master/examples/UIB/Component/DataPump or https://github.com/hgourvest/uib/tree/master/examples/UIB/Component/CloneDatabase — then use the http://TDBF.sf.net component to save every table to a standalone file. Make your application bat-callable and here you go. – Arioch 'The Aug 02 '16 at 14:01
  • I'm confused what the question is here. Are you asking us to write the entire code for an application to export your Interbase data as CSV? – Ken White Aug 02 '16 at 17:01
  • Thanks for your answers. I'm not able to set the database to read-only, as this is a live system, and our test DB only has data from last week, which is too old. ODBC would be possible, I think, but I don't know whether I would run into file limitations. I want to export the data so that the live system doesn't take a performance hit from the complex queries. I could imagine your solution, Arioch: query the results in steps of, say, 100,000 rows, write them into an array, and finally export/append them to the CSV? I hope I don't get an OOM.. :) – Matze Aug 03 '16 at 05:20

1 Answer


I've managed to answer this question thanks to Arioch.

My program takes two parameters, "name of DB" and "rows to fetch" (plus an optional delimiter). It determines the number of rows in the table (a simple Count(ID)) and divides them into chunks of "rows to fetch" rows (as I ran out of memory before).
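Splitting the row count into chunks is just a ceiling division; a minimal sketch in Python (the original program is Delphi, and the function name here is illustrative):

```python
def chunk_ranges(total_rows, rows_to_fetch):
    """Yield 1-based (first, last) row ranges covering total_rows rows,
    each at most rows_to_fetch rows wide."""
    for first in range(1, total_rows + 1, rows_to_fetch):
        yield first, min(first + rows_to_fetch - 1, total_rows)

# e.g. 400,000 rows fetched in chunks of 150,000:
print(list(chunk_ranges(400_000, 150_000)))
# → [(1, 150000), (150001, 300000), (300001, 400000)]
```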

The export file is created at the beginning, with the column names obtained from my IBCQuery as the first line. The width of my multidimensional array is set to the column count of the table; its length is "rows to fetch".

The statement looks like "SELECT * FROM TABLENAME ROWS X TO Y". It is executed for every chunk of the table, and the results are written into my array. After the array is filled, the query is closed and the data is appended to my CSV file. Then I free the array and load the next chunk, until the whole table has been written to the file.
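Put together, the chunked export loop looks roughly like this. This is a Python sketch (the original is Delphi); `fetch_rows` is a hypothetical stand-in for executing `SELECT * FROM TABLENAME ROWS first TO last` via the database component and returning the rows:

```python
import csv

def export_table(fetch_rows, column_names, total_rows, rows_to_fetch, out_path):
    """Export a table chunk by chunk, so at most rows_to_fetch rows
    are held in memory at any time.

    fetch_rows(first, last) stands in for running
    "SELECT * FROM TABLENAME ROWS first TO last" and returning its rows.
    """
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(column_names)  # column names as the first line
        for first in range(1, total_rows + 1, rows_to_fetch):
            last = min(first + rows_to_fetch - 1, total_rows)
            # append this chunk to the file, then let it be discarded
            writer.writerows(fetch_rows(first, last))

# usage with a fake in-memory "table" of 7 rows, fetched 3 at a time:
rows = [(i, f"name{i}") for i in range(1, 8)]
export_table(lambda a, b: rows[a - 1:b], ["ID", "NAME"], len(rows), 3, "out.csv")
```

Because each chunk is written and released before the next is fetched, memory use stays bounded by the chunk size rather than the table size.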

So the next limitation would be the file size, I think.

Thanks for your help!

Matze