
I'm wondering if there is a limit to how many records can be updated and submitted at one time using LINQ. My query looks like:

var reports = prdb.Reports
              .Where(w => w.FileYear == FileYear)
              .AsQueryable();

foreach (var r in reports)
{
    // update the report teacher name field
    var user = syndb.GetUser(r.CreatedBy);
    string strTeacherName = user.given + " " + user.Surname;
    r.TeacherName = strTeacherName;
}
prdb.SubmitChanges();

This query could bring back thousands of records. Will it execute, or will it crash?

JeffreyJ
  • Why don't you try it and check whether it works or not? – Gaurang Dave Apr 13 '18 at 02:48
  • I think your problem would be how long it will take to update all that data, rather than whether it works. – Zwei James Apr 13 '18 at 03:17
  • Gaurang, the query will be executed against a production database, so I need to know it will work and not crash. – JeffreyJ Apr 13 '18 at 03:18
  • Zwei, if that is the case I'm happy with a lengthy query execution time – JeffreyJ Apr 13 '18 at 03:22
  • Related: [How often to call DataContext.SubmitChanges() for a large number of inserts?](https://stackoverflow.com/questions/3154284/how-often-to-call-datacontext-submitchanges-for-a-large-number-of-inserts) and [Can I do a very large insert with Linq-to-SQL?](https://stackoverflow.com/questions/5166338/can-i-do-a-very-large-insert-with-linq-to-sql) – Lance U. Matthews Apr 13 '18 at 03:36
  • Related: [How do I avoid a memory leak with LINQ-To-SQL?](https://stackoverflow.com/questions/123057/how-do-i-avoid-a-memory-leak-with-linq-to-sql) – Lance U. Matthews Apr 13 '18 at 03:43
  • Okay, so I went with my LINQ query, which updated between 1,200 and 1,600 reports at a time; there were no problems encountered at all. – JeffreyJ Apr 27 '18 at 03:42

2 Answers


I would suggest using a stored procedure with a MERGE statement if you need the operation to run as a single transaction; otherwise a plain SQL stored procedure will do the job.

The best approach is to go with SQL Server's SqlBulkCopy. Entity Framework is not designed for large bulk operations; SqlBulkCopy provides much more optimized performance.
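A minimal sketch of the SqlBulkCopy route, assuming a staging table `ReportTeacherNames_Staging` with `ReportId`/`TeacherName` columns (the table, columns, and connection string are placeholders, not from the question); a MERGE stored procedure would then join the staging rows to `Reports` server-side:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class BulkTeacherNameLoader
{
    // Build an in-memory table matching the assumed staging schema.
    public static DataTable BuildStagingTable(
        IEnumerable<(int ReportId, string TeacherName)> rows)
    {
        var table = new DataTable("ReportTeacherNames_Staging");
        table.Columns.Add("ReportId", typeof(int));
        table.Columns.Add("TeacherName", typeof(string));
        foreach (var (id, name) in rows)
            table.Rows.Add(id, name);
        return table;
    }

    // Bulk-load the staging rows in one streaming operation; a MERGE
    // stored procedure would then apply them to Reports set-based.
    public static void BulkLoad(string connectionString, DataTable staging)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = staging.TableName;
            bulk.BatchSize = 1000;        // rows sent per round trip
            bulk.WriteToServer(staging);
        }
    }
}
```

This keeps the client round trips to a handful of bulk batches instead of one UPDATE per row.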

Sweta Nair

This query could bring back thousands of records. Will it execute, or will it crash?

This depends on many factors: Do you have enough RAM? Do you get a NullReferenceException at element 1001? Do you get a transaction timeout after 3 minutes (when working against a DB)?

The simple answer is: you can work with even 100 million records if the environment provides the right conditions.

To answer the question a bit more broadly (I assume you are running this against a database): you are presumably looking for the most efficient way to do it. If you need to update a million records with a fixed value, or with a value derived from another column of the same row, use something like `update Reports set TeacherName = (select u.given + ' ' + u.Surname from users u where u.userid = Reports.CreatedBy)` and execute it via `Context.Database.ExecuteSqlCommand("..")`. The database updates all rows in one set-based statement, which is much faster than a program using LINQ on top of it that issues one UPDATE per entry.
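Note that the question actually uses a LINQ to SQL `DataContext` rather than EF; there the equivalent of `ExecuteSqlCommand` is `DataContext.ExecuteCommand`. A hedged sketch assuming the `Reports`/`users` schema from the question (the connection-string handling and class wrapper are placeholders):

```csharp
using System.Data.Linq;

static class SetBasedUpdate
{
    // One set-based UPDATE: the server joins Reports to users itself,
    // instead of the client issuing one UPDATE per tracked row.
    public const string UpdateSql = @"
        UPDATE r
        SET    r.TeacherName = u.given + ' ' + u.Surname
        FROM   Reports r
        JOIN   users u ON u.userid = r.CreatedBy
        WHERE  r.FileYear = {0}";

    public static int Run(string connectionString, int fileYear)
    {
        using (var prdb = new DataContext(connectionString))
        {
            // {0} is sent as a SQL parameter, not string-concatenated.
            return prdb.ExecuteCommand(UpdateSql, fileYear);
        }
    }
}
```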

If you have complex operations and individual updates that depend on user input or the like, you may indeed need to do it via EF and LINQ queries.
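If you do stay in LINQ, one common mitigation (a sketch of the general pattern, not from the question's code) is to submit in batches so the change tracker and each transaction stay small; the batch size of 250 and the helper name are arbitrary:

```csharp
using System;
using System.Collections.Generic;

static class Batched
{
    // Apply `update` to each item and invoke `submit` every batchSize items
    // (e.g. submit = () => prdb.SubmitChanges()). Returns the number of
    // submit calls, including the final flush of any remainder.
    public static int Process<T>(IEnumerable<T> items, Action<T> update,
                                 Action submit, int batchSize = 250)
    {
        int pending = 0, submits = 0;
        foreach (var item in items)
        {
            update(item);
            if (++pending == batchSize)
            {
                submit();
                submits++;
                pending = 0;
            }
        }
        if (pending > 0) { submit(); submits++; }  // flush the remainder
        return submits;
    }
}
```

With the asker's ~1,600 reports and a batch size of 250, `submit` runs 7 times instead of once over one huge unit of work.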

Do tests (use `Stopwatch`), try everything, and learn!

Chrᴉz remembers Monica