1

I have an ASP.NET web application and I need to delete and insert multiple records in one shot.

Let's say a user clicks on a button. It will delete all the existing records for this user and insert a new set of records into the database.

For example, User A has 720 records in the database. When the user clicks the Update button, it will delete those existing 720 records and insert a set of new records (let's say 854 records) into the database.

This function will be used frequently by the user.

I plan to do it in this way:

  1. Delete all records for a user

  2. Store all the records in a collection

  3. Use a foreach loop over the entire collection and insert the records one by one

But I am afraid there might be a performance issue.

Is there any better way to do this?

dcaswell
My2ndLovE
  • Can you state the reason you are worrying about performance? I don't see one. – Yosi Dahari Oct 19 '13 at 13:40
  • Can depend on the database. What database are you using? Inserting one row at a time is not efficient, but for fewer than 10,000 rows it is still going to be pretty fast. – paparazzo Oct 19 '13 at 13:49
  • Visit here: Fastest way to insert 30 thousand rows in a temp table on SQL Server – http://stackoverflow.com/questions/17028657/fastest-way-to-insert-30-thousand-rows-in-a-temp-table-on-sql-server-with-c-shar?rq=1 – Nitin Dominic Oct 19 '13 at 15:35

6 Answers

3

If you have a collection of objects then I would:

  1. Put it in a transaction

  2. Delete

  3. Insert using a Table-Valued Parameter (TVP), sorting the insert on the PK to minimize fragmentation

The insert via the TVP is kind of like a reverse DataReader. If you had raw data you were parsing and loading I would say bulk copy, but you state you have a collection. This is a great reference on how to use a collection with a TVP.

A simple insert of one row at a time for fewer than 1,000 rows should still be pretty fast; where that runs into problems is a very active table where it is hard to get a write lock. Even with a simple insert you can pack up to 1,000 rows into one statement with VALUES (),(). But I would still go with a TVP for anything over 100 rows if you are looking for speed.
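A minimal sketch of the TVP approach described above (the type name dbo.RecordTvp, table dbo.Records, and column names are assumptions for illustration, not from the question):

    -- One-time setup: a table type matching the rows to insert
    CREATE TYPE dbo.RecordTvp AS TABLE
    (
        RecordId INT PRIMARY KEY,   -- sorted on the PK to minimize fragmentation
        UserId   INT NOT NULL,
        Payload  NVARCHAR(200) NULL
    );
    GO

    -- Delete + insert in one transaction, one round trip
    CREATE PROCEDURE dbo.ReplaceUserRecords
        @UserId INT,
        @Rows   dbo.RecordTvp READONLY
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRANSACTION;
            DELETE FROM dbo.Records WHERE UserId = @UserId;
            INSERT INTO dbo.Records (RecordId, UserId, Payload)
            SELECT RecordId, UserId, Payload
            FROM @Rows
            ORDER BY RecordId;      -- insert in PK order
        COMMIT TRANSACTION;
    END

On the C# side the collection is passed as a single SqlParameter with SqlDbType.Structured (e.g. as a DataTable or an IEnumerable of SqlDataRecord), which is what makes it one command instead of hundreds.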

paparazzo
  • Looks nice. I have a question: in the stored proc part, is it one insert for the whole TVP, or does it loop through all the records in the TVP and insert them one by one? – My2ndLovE Oct 19 '13 at 16:09
  • It inserts records one by one but it is one insert command. It really is like the reverse of a DataReader: it holds the hose open and pushes the records through one by one. For sure, sort your collection by the PK. That sort is in memory; a fragmented index (page splits) is on disk. – paparazzo Oct 19 '13 at 19:14
2

All you need to do is DELETE the user's records before performing the INSERT. Why not do this as a database operation (run this code before inserting the user's records):

DELETE FROM MyTable WHERE UserId = @UserId
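In ADO.NET that delete might look like the following (a minimal sketch; the connection string and the MyTable/UserId names from the statement above are the only assumptions):

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("DELETE FROM MyTable WHERE UserId = @UserId", conn))
    {
        // Parameterized to avoid SQL injection and plan-cache pollution
        cmd.Parameters.Add("@UserId", SqlDbType.Int).Value = userId;
        conn.Open();
        int deleted = cmd.ExecuteNonQuery();   // number of rows removed
    }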

As for the fastest way to insert records, you can find many posts:

How to do very fast inserts to SQL Server 2008

Fastest way to insert in parallel to a single table

Yosi Dahari
2

You can use an ORM like Entity Framework, which lets you apply a whole set of changes in a single SaveChanges call (one transaction).

With Entity Framework you can do it like this:

MyEntities dbContext = new MyEntities();
dbContext.Users.Add(new User(){Name="User1"});
dbContext.Users.Add(new User(){Name="User2"});
dbContext.SaveChanges();

To delete all records using Entity Framework:

dbContext.Database.ExecuteSqlCommand("delete from User");

You can also do all the changes in one batch:

//get all or needed users
List<User> users = dbContext.Users.ToList();

foreach (var user in users)
{
    if (CheckCondition) //update user
    {
        user.Name = "new name";
        dbContext.Entry(user).State = EntityState.Modified;
    }
    else if (CheckCondition) // delete user
    {
        dbContext.Entry(user).State = EntityState.Deleted;
    }
}
dbContext.Users.Add(new User() {Name = "Name3"}); //add new user
dbContext.SaveChanges(); //save all changes in one batch in a single transaction
Morbia
1

Why not just update the necessary fields? Well, if you are worried about performance, use a raw query instead of a big ORM. And judging from your needs, I think it's better to use a transaction when you do the delete and insert operations.

Here's something I think will be useful for you:

ado-net-sqltransaction-improves-performance
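A minimal sketch of the delete-and-insert pair wrapped in one SqlTransaction (the Records table, its columns, and the records collection are assumptions for illustration):

    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlTransaction tx = conn.BeginTransaction())
        {
            try
            {
                // Step 1: remove the user's existing rows
                using (var delete = new SqlCommand(
                    "DELETE FROM Records WHERE UserId = @UserId", conn, tx))
                {
                    delete.Parameters.Add("@UserId", SqlDbType.Int).Value = userId;
                    delete.ExecuteNonQuery();
                }

                // Step 2: insert the new set inside the same transaction
                foreach (var record in records)
                {
                    using (var insert = new SqlCommand(
                        "INSERT INTO Records (UserId, Payload) VALUES (@UserId, @Payload)",
                        conn, tx))
                    {
                        insert.Parameters.Add("@UserId", SqlDbType.Int).Value = userId;
                        insert.Parameters.Add("@Payload", SqlDbType.NVarChar, 200).Value = record.Payload;
                        insert.ExecuteNonQuery();
                    }
                }

                tx.Commit();   // both steps succeed or neither does
            }
            catch
            {
                tx.Rollback(); // no half-deleted state if an insert fails
                throw;
            }
        }
    }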

reptildarat
0

If many of the records will stay the same or only need to be updated, you don't need a delete-insert pair. You can use the MERGE statement together with table-valued parameters to perform inserts, updates and deletes in one maximally efficient statement at once. This is likely to be the fastest way. MERGE is well-optimized to perform all writes at once in the best possible way.

My experience is that this technique works nicely in practice.
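A minimal sketch of MERGE driven by a table-valued parameter (the type dbo.RecordTvp, table dbo.Records, and column names are assumptions): rows present in the TVP are inserted or updated, and the user's rows missing from it are deleted, all in one statement.

    -- One-time setup: a table type for the incoming rows
    CREATE TYPE dbo.RecordTvp AS TABLE
    (
        RecordId INT PRIMARY KEY,
        Payload  NVARCHAR(200) NULL
    );
    GO

    CREATE PROCEDURE dbo.SyncUserRecords
        @UserId INT,
        @Rows   dbo.RecordTvp READONLY
    AS
    MERGE dbo.Records AS target
    USING @Rows AS source
        ON target.RecordId = source.RecordId
    WHEN MATCHED THEN
        UPDATE SET target.Payload = source.Payload
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (RecordId, UserId, Payload)
        VALUES (source.RecordId, @UserId, source.Payload)
    WHEN NOT MATCHED BY SOURCE AND target.UserId = @UserId THEN
        DELETE;

The extra predicate on the NOT MATCHED BY SOURCE clause keeps the statement from deleting other users' rows.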

usr
0

You can bulk-insert rows into a database table from a DataTable.

Use the following, which I have used many times:

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.ApplicationBlocks.Data;
    using AssamExitPollReport;

    public partial class Default : System.Web.UI.Page
    {
        DataSet ds = new DataSet();
        clsdata obj = new clsdata();

        protected void Page_Load(object sender, EventArgs e)
        {
            string tablename = Request.QueryString["tablename"];

            Response.Write("<h3>Execution started on " + DateTime.Now.ToLocalTime() + "</h3>");

            if (tablename != null)
            {
                bool flg = BulkInsertDataTable(tablename);
                if (flg)
                    Response.Write("<br><h3>Successfully executed on " + DateTime.Now.ToLocalTime() + "</h3>");
                else
                    Response.Write("<br><h3>Oops! Something is wrong.</h3>");
            }
            else
            {
                Response.Write("<br><h3>Oops! Parameter \"tablename\" is missing.</h3>");
            }
        }

        public bool BulkInsertDataTable(string tablename)
        {
            bool isSuccess = true;
            try
            {
                string client = "Server=databasehost\\SQLEXPRESS;Database=dbname;Uid=username;Pwd=yourpassword;Trusted_Connection=no";

                // Load the source rows, then clear the destination before the bulk copy
                ds = obj.Executedataset("select_tablename");
                obj.ExecuteDataset("delete_temp", new object[] { tablename });

                using (SqlConnection destinationConnection = new SqlConnection(client))
                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
                {
                    destinationConnection.Open();
                    bulkCopy.DestinationTableName = tablename;
                    bulkCopy.WriteToServer(ds.Tables[0]); // streams the whole DataTable in one bulk operation
                }
            }
            catch (Exception)
            {
                isSuccess = false;
            }

            return isSuccess;
        }
    }

The clsdata class is used for executing the stored procedures:

using System;
using System.Data;
using System.Configuration;
using System.Linq;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.HtmlControls;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Xml.Linq;
using Microsoft.ApplicationBlocks.Data;

namespace AssamExitPollReport
{
    public class clsdata
    {
       // string conn = "Data Source=ADMIN-B19C3BADF;Initial Catalog=bazarkhodro;Integrated Security=True";



        public DataSet Executedataset(string spName)
        {
            return SqlHelper.ExecuteDataset(System.Configuration.ConfigurationManager.ConnectionStrings["conn1"].ConnectionString, CommandType.StoredProcedure, spName);
        }
        public DataSet ExecuteDataset(string spName, object[] values)
        {
            return SqlHelper.ExecuteDataset(System.Configuration.ConfigurationManager.ConnectionStrings["conn1"].ConnectionString, spName, values);
        }
        public DataSet ExecuteDataset(string spName, object[] values, string conn)
        {
            return SqlHelper.ExecuteDataset(conn, spName, values);
        }
        public int ExecuteNonQuery(string spName, object[] values, string conn)
        {
            int i = 0;
            try
            {

                i = SqlHelper.ExecuteNonQuery(conn, spName, values);

            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
            return i;

        }

        public int ExecuteScaler(string spName, object[] values)
        {
            int i = 0;
            try
            {
                i = Convert.ToInt32(SqlHelper.ExecuteScalar(System.Configuration.ConfigurationManager.ConnectionStrings["conn"].ConnectionString, spName, values));
            }
            catch (Exception)
            {

                throw;
            }
            return i;
        }

        public int ExecuteScaler(string spName, object[] values, string conn)
        {
            int i = 0;
            try
            {
                i = Convert.ToInt32(SqlHelper.ExecuteScalar(conn, spName, values));

            }
            catch (Exception)
            {

                throw;
            }
            return i;
        }

    }
}

where the connection strings (conn, conn1) are defined in web.config.

FAISAL