I am loading about 4 million records from the database and processing them one row at a time. To do so, I iterate through each DataRow of the DataSet, which is causing efficiency problems.
How can I make this faster? I tried a Parallel.ForEach loop. However, since DataRows aren't thread safe, I had to add lock blocks around the places where I write to them, and that made performance even worse. The per-row logic is about 3k lines long, so every row runs through all 3k lines.
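A minimal sketch of the pattern I tried (column and method names here are illustrative, not my real code):

```csharp
using System;
using System.Data;
using System.Threading.Tasks;

public class Sketch
{
    static readonly object _sync = new object();

    public static void Process(DataTable table)
    {
        Parallel.ForEach(table.AsEnumerable(), row =>
        {
            // ... ~3k lines of per-row logic here ...

            // DataRow writes aren't thread safe, so every write
            // is serialized behind this lock.
            lock (_sync)
            {
                row["Status"] = "Processed";
            }
        });
    }
}
```

With the real logic, the threads spend most of their time waiting on the lock, which is why this ended up slower than the sequential loop.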
Is there any way to make this more efficient? I was thinking of copying the rows into a List of DataRows instead of iterating the DataTable directly. If I make that change, can I use Parallel.ForEach and expect better performance?
Or should I create an entity class for the report and build a List of that class instead? Which would be faster: a List of System.Data.DataRow or a List of the entity class?
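This is roughly what I mean by the entity-class option (the `ReportRow` type and its fields are hypothetical, just to show the shape of the idea):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Threading.Tasks;

// Hypothetical entity for the report; real one would mirror my columns.
public class ReportRow
{
    public int Id;
    public decimal Amount;
    public string Status;
}

public class EntitySketch
{
    // One-time copy out of the DataTable into plain objects.
    public static List<ReportRow> ToEntities(DataTable table)
    {
        return table.AsEnumerable()
            .Select(r => new ReportRow
            {
                Id = r.Field<int>("Id"),
                Amount = r.Field<decimal>("Amount")
            })
            .ToList();
    }

    // Each thread mutates only its own entity, so no lock is needed.
    public static void Process(List<ReportRow> rows)
    {
        Parallel.ForEach(rows, row =>
        {
            // ... per-row logic would go here ...
            row.Status = "Processed";
        });
    }
}
```

My hope is that because each `ReportRow` is an independent object, the lock disappears entirely, at the cost of the up-front copy from the DataTable.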
I understand this is really a design problem, but there isn't much I can change at that level. I'd appreciate any kind of help. Thank you.