
I'm in the process of migrating data from DB2 to SQL Server using a linked server and OPENQUERY, like below:

 --SET STATISTICS IO ON
 -- Number of records: 18176484
 SELECT * INTO [DBName].[DBO].[Table1]
 FROM OPENQUERY(DB2,
    'SELECT * FROM OPERATIONS.Table1')

This query takes 9 hours and 17 minutes to insert the 18,176,484 records.

Is there another way to insert the records more quickly? Can I use the `OPENROWSET` function to do a bulk insert? Or would an SSIS package perform better and take less time? Please help.

AskMe
  • I think `bulk insert` just separates your insert procedure into a few batches; it may not speed up the process – LONG Mar 27 '17 at 17:44
  • Any other suggestion? – AskMe Mar 27 '17 at 17:46
  • Do you have an identity column on that table? If so, you can use that to split the transfer operations into different batches, using the identity as a key to knowing what records you've already migrated. Splitting into different batches would very likely help speed things along. – Siyual Mar 27 '17 at 17:46
  • since your `openquery` is selecting everything, there is no point running the distributed query again; it may just add load on the DB2 system... – LONG Mar 27 '17 at 17:47
  • No. But we can insert one. If I implement identity column, what would be the quicker solution? – AskMe Mar 27 '17 at 17:48
  • an identity column will consume more system resources during the insert process – LONG Mar 27 '17 at 17:49
  • I worked extensively with DB2 to SQL Server transfers and could not find a way to speed this up. By the way, when I stopped the connection mid-transfer, DB2 did not clear up the connection for a very long time (at least an hour). After blaming me, the DB2 person finally traced it and discovered that it was DB2's fault. – benjamin moskovits Mar 27 '17 at 19:02
  • The only way I was allowed to connect to DB2 was by using a linked server connection. – benjamin moskovits Mar 27 '17 at 19:09
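A batched transfer along the lines Siyual suggests might look like the following sketch. This is an assumption-laden example, not tested against your system: it presumes a numeric key column (here called `KeyCol`, a placeholder name) exists or has been added on the DB2 table, and that 19,000,000 safely exceeds the maximum key value.

```sql
-- Hypothetical batched copy: each OPENQUERY round-trip pulls one key range,
-- keeping transactions small and making the transfer restartable.
-- KeyCol, the batch size, and the upper bound are placeholders.
DECLARE @start BIGINT = 0, @batch BIGINT = 1000000, @sql NVARCHAR(MAX);

WHILE @start < 19000000  -- a bit above the known row count
BEGIN
    SET @sql = N'INSERT INTO [DBName].[DBO].[Table1] WITH (TABLOCK)
        SELECT * FROM OPENQUERY(DB2, ''SELECT * FROM OPERATIONS.Table1
            WHERE KeyCol >= ' + CAST(@start AS NVARCHAR(20)) +
        N' AND KeyCol < ' + CAST(@start + @batch AS NVARCHAR(20)) + N''')';
    EXEC sp_executesql @sql;
    SET @start += @batch;
END
```

The `WITH (TABLOCK)` hint on the target table helps SQL Server use minimal logging for the inserts under the simple or bulk-logged recovery model.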

1 Answer


You probably want to export the data to a CSV file, as in this answer on Stack Overflow:

EXPORT TO result.csv OF DEL MODIFIED BY NOCHARDEL SELECT col1, col2, coln FROM testtable;

(Exporting result of select statement to CSV format in DB2)

Once it's a CSV file, you can import it into SQL Server using either BCP or SSIS, both of which are extremely fast, especially if you take a table lock on the target table.
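A minimal bcp import of such a file might look like this sketch. The server, database, and file names are placeholders; `-c` uses character mode, `-t","` sets the comma delimiter produced by the DEL export, `-b` commits in batches, and `-h "TABLOCK"` takes a table lock for faster bulk loading:

```
bcp DBName.dbo.Table1 in result.csv -S MyServer -T -c -t"," -b 50000 -h "TABLOCK"
```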

benjamin moskovits
  • I have not tried this. But will a CSV file hold 1 billion records? I have one table with more than 1 billion records. What is the maximum number of records one CSV file will support? Also, do I need a tool to export data to CSV from DB2, or will your command work on its own? Where do I run it? – AskMe Mar 28 '17 at 01:41
  • You can have very large CSV files that SQL Server will have no problem with. I would do the arithmetic and try to keep the total size of each CSV file below 4 GB. You can export 50 million rows at a time (depending on the row size) and bcp-import each file into SQL Server. The `export` command is a standard command in DB2. – benjamin moskovits Mar 29 '17 at 13:10
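Splitting the export into key ranges, as the comment above describes, might look like the following sketch run from the DB2 command line processor. `KeyCol` and the range bounds are placeholders; one such statement would be run per file:

```sql
-- Hypothetical ranged export: one DEL-format (comma-delimited) file per key range.
EXPORT TO part1.csv OF DEL MODIFIED BY NOCHARDEL
  SELECT * FROM OPERATIONS.Table1
  WHERE KeyCol >= 0 AND KeyCol < 50000000;
```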