
I want to make a multiple insert, so I will pass a set of data, and I don't know which method is better from a performance point of view.

Passing a `DataTable` or a list of objects, like this:

 - `public static int Insert(List<EndServReward> reward)` // list of objects
 - `public static int Insert(DataTable reward)` // DataTable
Anyname Donotcare

1 Answer
Since neither of those actually shows any insert code, it is meaningless to comment. Generally speaking, the performance of any C# here is going to be vastly dwarfed by the latency of talking to an out-of-process database server that is presumably on a different machine on the LAN.

However! Yes, it is possible for the library implementation to make a difference. As examples, Entity Framework and NHibernate are both big, complicated products with lots of abstractions and complications - and as a result there can sometimes be measurable overheads. Other libraries are intentionally simple and lightweight, handling the 90% scenarios really efficiently, and not bothering to support the hard 10%. But: you don't indicate what library (if any) you are using, so we can't comment. Likewise, there are inbuilt overheads in things like DataTable - but you'd only notice them when dealing in non-trivial batch sizes.

If you want the fastest possible performance, then dropping to SqlBulkCopy via an IDataReader will be the winner - but that is only worth doing if you are inserting lots of data: you wouldn't bother with that to insert 20 rows. But if you are inserting 2000 rows, sure! FastMember provides a property-mapped IDataReader implementation for you - for example:

// SqlBulkCopy is in System.Data.SqlClient; ObjectReader is from the FastMember NuGet package
using(var bcp = new SqlBulkCopy(connection))
using(var reader = ObjectReader.Create(list, "Id", "Name", "Description"))
{ //           members to include, or omit for all ^^^^^^^^
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}

Note that bulk-insert is not the way to go for every day-to-day insert into a database: again, this is to fit a specific scenario (efficient insert of large quantities of data).

Marc Gravell
  • I don't use any `ORM` in my case; I use Informix, and my version doesn't have `IfxBulkCopy`, so I have to loop over my `DataTable` or the list of objects to insert, and I don't know which is better for memory. – Anyname Donotcare Oct 28 '13 at 08:52
  • @just_name from a *memory* view? probably the list of objects - `DataTable` is an abstraction to support arbitrary data models, including before/after snapshots: there are lots of overheads when dealing with `DataTable`. However, even without an ORM, a lot of this hinges on how your code is written - how efficiently it talks to ADO.NET etc. Frankly, I think you'd do well to look at a tool like [dapper](https://code.google.com/p/dapper-dot-net/) which is designed to make it hard to fail - it handles all the ADO.NET side for you, so you can focus on useful code. For example: – Marc Gravell Oct 28 '13 at 08:55
  • if each object has a `.Id`, `.Name` and `.Value`, then using TSQL (obviously tweak the syntax for your db vendor), something like `connection.Execute("insert [sometable] (Id, Name, Value) values (@Id, @Name, @Value)", listOfObjects);` is all you need to do for it to do the inserts for the entire list, handling parameterization, command management, etc – Marc Gravell Oct 28 '13 at 08:57
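For the scenario raised in the comments (Informix, no `IfxBulkCopy`, looping over a list of objects), a minimal sketch of the loop-insert pattern might look like the following. This is an assumption-laden illustration, not a verified implementation: the table and column names are made up, `rewards` stands in for the `List<EndServReward>` from the question, and `IfxConnection`/`IfxCommand` come from the IBM.Data.Informix ADO.NET provider, which uses `?` positional parameter markers. The key idea is to prepare one parameterized command and reuse it for every row, inside a single transaction, rather than building a new command (or concatenating SQL) per row:

```csharp
// Sketch only: assumes the IBM.Data.Informix provider; names are illustrative.
using (var conn = new IfxConnection(connectionString))
using (var cmd = conn.CreateCommand())
{
    conn.Open();
    // Parameterize once; only the parameter values change per row.
    cmd.CommandText = "insert into end_serv_reward (id, name) values (?, ?)";
    var pId = cmd.Parameters.Add("id", IfxType.Integer);
    var pName = cmd.Parameters.Add("name", IfxType.VarChar);

    using (var tx = conn.BeginTransaction()) // one transaction for the whole batch
    {
        cmd.Transaction = tx;
        foreach (var item in rewards)        // rewards: List<EndServReward>
        {
            pId.Value = item.Id;
            pName.Value = item.Name;
            cmd.ExecuteNonQuery();
        }
        tx.Commit();
    }
}
```

Memory-wise this favours the list of objects over a `DataTable`, for the reasons given in the comment above: each row is just your own POCO, with none of the change-tracking overhead `DataTable` carries.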