I'm loading data files into a staging table, which works perfectly fine, but afterwards I need to convert several columns to different datatypes for the final table, and the INSERT statement becomes infeasible for big tables. The statements are generated by a tool of mine and I want to optimize this part. The insert statements look something like this, but with many more columns, validity checks, and some necessary replacements inside these converts:
INSERT INTO foo
SELECT
    CONVERT(decimal(10,2), col1),
    CONVERT(date, col2),
    CONVERT(decimal(10,2), col3),
    CONVERT(int, col4),
    CONVERT(decimal(10,2), col5),
    ...
FROM foo_staging
On small tables this runs pretty fast, but as the tables get bigger the performance degrades badly. Any ideas on how to improve the performance of these conversions?
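One thing I considered is splitting the insert into batches to keep each transaction small. A rough sketch (it assumes foo_staging has a sequential id column, which my real staging table may not have, and the batch size is arbitrary):

```sql
DECLARE @start bigint = 1, @batch bigint = 50000, @max bigint;
SELECT @max = MAX(id) FROM foo_staging;

WHILE @start <= @max
BEGIN
    -- Insert one key range per loop iteration instead of everything at once
    INSERT INTO foo
    SELECT
        CONVERT(decimal(10,2), col1),
        CONVERT(date, col2),
        CONVERT(decimal(10,2), col3),
        CONVERT(int, col4),
        CONVERT(decimal(10,2), col5)
    FROM foo_staging
    WHERE id >= @start AND id < @start + @batch;

    SET @start += @batch;
END
```

Would something like this actually help, or is the cost really in the CONVERT calls themselves?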
EDIT: I'm using BULK INSERT to load the staging table, and the files are really big.
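For context, the staging load looks roughly like this (file path, delimiters, and options are placeholders, not my actual setup):

```sql
BULK INSERT foo_staging
FROM 'C:\data\foo.csv'          -- placeholder path
WITH (
    FIELDTERMINATOR = ';',      -- placeholder delimiter
    ROWTERMINATOR   = '\n',
    TABLOCK                      -- table lock during the load
);
```

The bulk load itself is fast; it's only the subsequent INSERT...SELECT with the conversions that becomes the bottleneck.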