
I am building a web application using C# .NET with SQL Server 2008 as the back end. The application reads data from Excel and inserts it into a SQL table. For this I used the SqlBulkCopy class, which works very well. The table has 50 fields, two of which are system_error and mannual_error. After inserting records into the other 48 columns, I need to re-check all of those records and update the two error columns with specific error messages, e.g. "Name field has a number", "Qty not specified", etc. To do this I fetch each record into a DataTable and check every column in a for loop.

It works very well when there are 1,000 to 5,000 records, but it takes a huge amount of time, around 50 minutes, when there are 100,000 records or more.

Initially I used a simple SQL UPDATE query, and then a stored procedure, but both take the same amount of time.

How can I increase the performance of the application? What are other ways to update huge amounts of data? Any suggestions are welcome.

  • Hello, if I understand correctly: data is uploaded to SQL by SqlBulkCopy, and then a post-processing validation step checks row by row, column by column, and the validation result for each row is written to the system_error and mannual_error columns. Is that correct? I understand why you use SqlBulkCopy; it's very fast. Have you thought about doing that validation in the application? Keep a chunk of the Excel data in a memory variable, do the validation, write the results into the _error columns of the corresponding rows in the DB, and then take another chunk of the Excel data to validate? – Jaroslav Kubacek Mar 14 '14 at 11:27

2 Answers


This is one reason people use MongoDB and other NoSQL systems. That said, you can update huge data sets in SQL Server by optimizing your query. Read more here: http://www.sqlservergeeks.com/blogs/AhmadOsama/personal/450/sql-server-optimizing-update-queries-for-large-data-volumes
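One of the techniques discussed for large-volume updates is running the UPDATE in batches, so each transaction stays small and the transaction log does not balloon. A minimal sketch (the table name dbo.ImportedRecords and the validation condition are assumptions, not the asker's actual schema):

```sql
-- Hypothetical sketch: update a large table in batches of 10,000 rows.
-- Loop until an UPDATE affects zero rows.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) dbo.ImportedRecords
    SET    system_error = 'Name field contains a number'
    WHERE  ISNUMERIC(Name) = 1
      AND  system_error IS NULL;   -- skip rows already flagged

    SET @rows = @@ROWCOUNT;
END
```

Each iteration commits a small transaction, which keeps locking and log growth manageable on 100,000+ row tables.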

Also check: Best practices for inserting/updating large amount of data in SQL Server 2008


One thing to consider is that iterating over a database table row by row, rather than performing set-based update operations, incurs a significant performance hit.
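For example, the per-row checks described in the question could be expressed as a single set-based statement that flags every bad row at once (table and column names here are assumptions based on the question, not the real schema):

```sql
-- Hypothetical set-based version of the row-by-row validation.
-- One UPDATE replaces the entire DataTable loop.
UPDATE dbo.ImportedRecords
SET    system_error  = CASE WHEN ISNUMERIC(Name) = 1 THEN 'Name field contains a number' END,
       mannual_error = CASE WHEN Qty IS NULL         THEN 'Qty not specified' END
WHERE  ISNUMERIC(Name) = 1
   OR  Qty IS NULL;
```

The database engine can then use indexes and a single scan instead of one round trip per row.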

If you are in fact performing set-based updates on your data and still have significant performance problems, you should look at the execution plans of your queries so that you can work out where and why they are performing so badly.
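In SQL Server Management Studio you can include the actual execution plan (Ctrl+M) before running the statement, or wrap it in statistics output to measure I/O and CPU cost before and after tuning. A sketch with assumed table/column names:

```sql
-- Turn on timing and I/O statistics for the session,
-- run the statement under test, then turn them off.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

UPDATE dbo.ImportedRecords
SET    mannual_error = 'Qty not specified'
WHERE  Qty IS NULL;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

The Messages tab then shows logical reads and elapsed time per statement, which makes it easy to tell whether an index or query rewrite actually helped.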