We have an application built with Entity Framework and SQL Server. The data-import function is wrapped in a single long-running transaction, so either all of its actions succeed or nothing is persisted to the database. However, that long-running transaction blocks other transactions and affects users who are online at the time. Isolation levels and optimistic concurrency control don't help, because write operations still block other write operations in the underlying database. It is also hard to defer all of the data-modification operations to the end of the import function. Are there other solutions? Thanks in advance!
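For context, the pattern the question describes might look like the following (a minimal sketch, assuming EF6 with an explicit transaction; `ImportContext` and the elided import steps are illustrative, not from the original post):

```csharp
using System;
using System.Data.Entity;

public class ImportContext : DbContext { /* DbSets omitted for brevity */ }

public static class DataImport
{
    public static void Run()
    {
        using (var context = new ImportContext())
        using (var transaction = context.Database.BeginTransaction())
        {
            try
            {
                // ...many reads and writes across multiple live tables...

                context.SaveChanges();
                transaction.Commit();   // row/page locks are held until here
            }
            catch
            {
                transaction.Rollback(); // all-or-nothing behavior
                throw;
            }
        }
    }
}
```

Because every write happens inside the one transaction, the locks it takes on live tables are held for the entire import, which is exactly what blocks concurrent users.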
- Do not write directly to the "production" table. First import and process your data in a staging table, and only then copy it to the "production" table. – Alex Nov 03 '17 at 03:22
- Do you need to write all the data in one go, or can you split it into batches? The only downside is that if you run into an issue, it can be tricky to roll back. – Joey Bob Nov 03 '17 at 09:53
- @JoeyBob Ideally we want all the import data and import steps in one go, but I think your suggestion is one option. Thanks. – Yang Nov 03 '17 at 11:04
- @Yang I do this when I have to bulk-import 210,000-odd rows into a SQL table. We use a combination of Alex's suggestion and mine: batch the rows in groups of 10,000 into a staging table, then use a stored procedure to copy the data over into the live table. This lets the rest of the application (the user side) continue unaffected until we've done our bit. If you'd like me to add the example code, I can do so in the answer. – Joey Bob Nov 03 '17 at 11:11
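Following the comments above, here is a minimal sketch of the staging-table-plus-batching approach Alex and Joey Bob describe, assuming EF6; the entity, context, and stored-procedure names (`StagingOrder`, `ImportContext`, `dbo.usp_CopyStagingToLive`) are hypothetical stand-ins:

```csharp
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Hypothetical entity and context; the original post does not show its model.
public class StagingOrder { public int Id { get; set; } /* ...other columns... */ }

public class ImportContext : DbContext
{
    public DbSet<StagingOrder> StagingOrders { get; set; }
}

public static class Importer
{
    public static void Run(IEnumerable<StagingOrder> rows)
    {
        const int batchSize = 10000;

        // 1) Write to the staging table in independent batches, so no
        //    long-running transaction ever touches the live tables.
        foreach (var batch in rows.Select((row, i) => new { row, i })
                                  .GroupBy(x => x.i / batchSize, x => x.row))
        {
            using (var context = new ImportContext())
            {
                context.Configuration.AutoDetectChangesEnabled = false; // faster bulk adds
                context.StagingOrders.AddRange(batch);
                context.SaveChanges();
            }
        }

        // 2) Promote the staged rows to the live table with one set-based
        //    operation inside the stored procedure, so user-facing queries
        //    are blocked only for that short final step.
        using (var context = new ImportContext())
        {
            context.Database.ExecuteSqlCommand("EXEC dbo.usp_CopyStagingToLive");
        }
    }
}
```

Each batch commits independently, so a failure mid-import leaves behind only staging data that can be cleaned up or resumed; the live tables change only during the final, short stored-procedure call, which is what keeps online users unaffected.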