16

Is there any solution for bulk delete in SQL Server?

I can't use TRUNCATE because I need a WHERE clause to limit which rows are affected.

Is there anything like Bulk Copy (bcp) for deleting data?

gbn
masoud ramezani

7 Answers

27

No.

You want a DELETE with a WHERE clause: this is standard SQL.

What you can do is batch deletes like this:

SELECT 'Starting' --sets @@ROWCOUNT
WHILE @@ROWCOUNT <> 0
    DELETE TOP (xxx) MyTable WHERE ...
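A fuller version of the same loop, written out with hypothetical names (the table `LogEntries`, the `CreatedDate` cutoff, and the 5000-row batch size are all assumptions for illustration):

```sql
-- Batched delete: repeat until a pass removes no rows.
-- Table name, column name, and batch size are illustrative.
DECLARE @BatchSize INT = 5000;

SELECT 'Starting';  -- primes @@ROWCOUNT to 1 so the loop body runs once

WHILE @@ROWCOUNT <> 0
BEGIN
    DELETE TOP (@BatchSize)
    FROM LogEntries
    WHERE CreatedDate < DATEADD(YEAR, -1, GETDATE());
END
```

Keeping each batch small means every transaction is short, so locks are released quickly and the log space can be reused (in SIMPLE recovery) or backed up (in FULL) between passes.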

Or if you want to remove a very high percentage of rows...

SELECT col1, col2, ... INTO #Holdingtable
           FROM MyTable WHERE ..opposite condition..
TRUNCATE TABLE MyTable
INSERT MyTable (col1, col2, ...)
           SELECT col1, col2, ... FROM #Holdingtable
gbn
  • And if it's a lot of transactions, depending on your backup model and log size, do more log backups to free up space in the log... – Cade Roux Jan 24 '10 at 15:10
  • Holding data in temporary tables will make you pay more memory usage. – Allan Chua Oct 18 '11 at 13:08
  • @Allan Chua: Can you prove that temp tables are held only in memory? – gbn Oct 18 '11 at 16:43
  • Ouch...learned the hard way...2h deleting in a batch when I could just apply this approach – Yaroslav Feb 28 '14 at 15:27
  • You should also make sure the recovery model is `BULK_LOGGED` or `SIMPLE` and use `WITH TABLOCK` to lock the table for the second INSERT, otherwise the transaction log will still be huge. See http://social.technet.microsoft.com/wiki/contents/articles/20651.sql-server-delete-a-huge-amount-of-data-from-a-table.aspx – Dan Bechard Jul 14 '16 at 15:18
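The recovery-model advice in that last comment, applied to the answer's truncate-and-reload pattern, might look like this (a sketch only; `MyDb` and the column list are placeholders, and changing the recovery model requires appropriate permissions and breaks the log-backup chain until the model is switched back and a backup is taken):

```sql
-- Switch to a minimally-logged recovery model for the bulk reload.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

-- TABLOCK lets the INSERT ... SELECT qualify for minimal logging.
INSERT INTO MyTable WITH (TABLOCK) (col1, col2)
SELECT col1, col2
FROM #Holdingtable;

-- Restore the original recovery model afterwards.
ALTER DATABASE MyDb SET RECOVERY FULL;
```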
5

You can do a couple of things if you want to delete part of your table and not TRUNCATE it.

You could select the part of the table you want to keep into a new table, then swap the two, like so:

SELECT *
INTO tmp_MyTable
FROM MyTable
WHERE Key = 'Value'

IF @@ROWCOUNT > 0
BEGIN
    EXEC sp_rename 'MyTable', 'old_MyTable'
    EXEC sp_rename 'tmp_MyTable', 'MyTable'
    TRUNCATE TABLE old_MyTable
END

Secondly, if you're using partitioning, you can create an identical (empty) table on the same partition scheme. If the table is partitioned according to your archiving/purging logic, you can switch one partition from your main table to the new table and then truncate the new table. For example:

ALTER TABLE MyTable
SWITCH PARTITION 15 TO purge_MyTable PARTITION 2
GO

TRUNCATE TABLE purge_MyTable
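For the SWITCH to succeed, the target table must match the source column-for-column (including indexes) and live on the same filegroup or partition scheme. A minimal sketch of creating such a purge table, where the column definitions and the scheme name `ps_MyScheme` are assumptions:

```sql
-- The purge table must be structurally identical to MyTable
-- and sit on the same partition scheme for SWITCH to work.
CREATE TABLE purge_MyTable
(
    Id      INT      NOT NULL,
    LogDate DATETIME NOT NULL
) ON ps_MyScheme (LogDate);

-- Move the partition's data out, then empty the purge table.
ALTER TABLE MyTable SWITCH PARTITION 15 TO purge_MyTable PARTITION 15;
TRUNCATE TABLE purge_MyTable;
```

Because SWITCH is a metadata-only operation, this moves the rows out of the main table almost instantly, regardless of how many rows the partition holds.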

P.S. Partitioning is available in the Enterprise Edition of SQL Server 2005/2008.

Hope this helps!

Aaronaught
Eyal Z.
  • good answer. FYI, you can use ALTER TABLE..SWITCH without partitioning too, but then you need 2 discrete tables. – gbn Jan 22 '12 at 11:19
  • +1 Using the temp table is likely to get fastest results when deleting lots of rows. If the goal is to reclaim the space, I would also run `DBCC SHRINKFILE` for all db files at the end of the operation. – vgru Mar 25 '20 at 12:19
0

Just ran into a similar issue with a staging table that did not scale well because of locking.

Since the relevant table is only referenced in one location for us, we simply replaced that reference with a query for the dynamic table name, which is created with a "select into" similar to what gbn suggested.

This is maintainable because the staging table is only referenced in one place in code, and the expense of the extra database call along with table creation is justified in a warehousing context. If you have only a few hundred records or reference the table in your code numerous times, then this approach may not work.
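A sketch of that pattern, with hypothetical names throughout (the `Orders` source table, the generated `stg_Orders_YYYYMMDD` name, and the filter are all assumptions):

```sql
-- Build a fresh staging table under a dynamic name with SELECT INTO.
-- All object names here are illustrative.
DECLARE @TableName SYSNAME =
    'stg_Orders_' + CONVERT(CHAR(8), GETDATE(), 112);

DECLARE @Sql NVARCHAR(MAX) =
    N'SELECT * INTO ' + QUOTENAME(@TableName) +
    N' FROM Orders WHERE BatchDate = CAST(GETDATE() AS DATE);';

EXEC sp_executesql @Sql;

-- The single place in code that reads the staging table looks up
-- the current name (e.g. from a config table) instead of hard-coding it.
```

Because each load gets a brand-new table, the old one can simply be dropped later, which sidesteps the delete problem entirely.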

MichaelBolton
0

When handling millions of rows, I prefer using a WHERE clause with SELECT INTO to copy the rows I want to keep into a new table, then dropping the original table and renaming the copy (back to the original name).

Bear in mind, though, that foreign keys, constraints, indexes, and so on are not copied by SELECT INTO and must be re-created. But using this method you avoid an enormous transaction log and the time consumed by deleting in chunks.
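A sketch of that approach (the column list and the `KeepRow` filter are placeholders; note again that the constraints and indexes must be re-created on the new table afterwards):

```sql
-- 1. Copy only the rows to KEEP into a new table (minimally logged
--    under SIMPLE/BULK_LOGGED recovery).
SELECT col1, col2
INTO MyTable_copy
FROM MyTable
WHERE KeepRow = 1;   -- the "keep" condition, not the delete condition

-- 2. Drop the original and rename the copy back.
DROP TABLE MyTable;
EXEC sp_rename 'MyTable_copy', 'MyTable';
```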

/Snedker

0

Check these resources:

  1. the MSDN article Delete_a_Huge_Amount_of_Data_from
  2. Information on Recovery Models
  3. View or Change the Recovery Model of a Database
Icarus
0

Sychare Jedko,

The advantage of TRUNCATE is that it avoids logging every individual row deletion; instead it logs only the page deallocations, so the entire operation produces minimal log activity.

Eyal Z.
0

You can use a loop to delete in chunks.

Here was my solution to delete 100 records at a time:

DECLARE @Rows INT
SET @Rows = 1

WHILE (@Rows > 0)
BEGIN
    ;WITH ToDelete (EMailNotificationID) AS
    (
        SELECT TOP (100) EMailNotificationID
        FROM email
    )
    DELETE FROM ToDelete

    SET @Rows = @@ROWCOUNT
END
Roberto Rodriguez