I have this pattern in a number of stored procedures:
-- Table1
[id] [int] IDENTITY(1,1) NOT NULL
[data] [varchar](512) NULL
[count] INT NULL
-- 'data' is unique, with a unique index on 'data' in 'Table1'
BEGIN TRY
INSERT INTO Table1 (data, count) SELECT @data,1;
END TRY
BEGIN CATCH
UPDATE Table1 SET count = count + 1 WHERE data = @data;
END CATCH
I've been slammed before for using this pattern:

"You should never have exception 'catching' in your normal logic flow. (That's why it's called an 'exception': it should be exceptional, i.e. rare.) Put an EXISTS check around your INSERT: IF NOT EXISTS (SELECT NULL FROM Data WHERE data = @data) BEGIN /* insert here */ END"
However, I can't see a way around it in this instance. Consider the following alternative approaches:
INSERT INTO Table1 (data, count)
SELECT @data, 1
WHERE NOT EXISTS (SELECT 1 FROM Table1 WHERE data = @data)
If I do this, it means every insert is unique, but I can't 'catch' an update condition.
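The nearest workaround I can think of (just a sketch, untested) is to check @@ROWCOUNT after the guarded insert and fall back to the UPDATE, though as far as I can tell that reintroduces the same check-then-act gap as the explicit IF version:

INSERT INTO Table1 (data, count)
SELECT @data, 1
WHERE NOT EXISTS (SELECT 1 FROM Table1 WHERE data = @data);

-- nothing inserted means the row already exists, so bump the counter instead
IF @@ROWCOUNT = 0
    UPDATE Table1 SET count = count + 1 WHERE data = @data;

The explicit-check version I've tried looks like this: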
DECLARE @id INT;
SET @id = (SELECT id FROM Table1 WHERE data = @data)
IF(@id IS NULL)
INSERT INTO Table1 (data, count) SELECT @data,1;
ELSE
UPDATE Table1 SET count = count + 1 WHERE data = @data;
If I do this, I have a race condition between the check and the insert: two concurrent callers can both see no row and both attempt the INSERT, and with the unique index in place the second one fails (without the index, I'd get duplicates). The third option is to wrap the check in a transaction:
BEGIN TRANSACTION
DECLARE @id INT;
SET @id = (SELECT id FROM Table1 WHERE data = @data)
IF(@id IS NULL)
INSERT INTO Table1 (data, count) SELECT @data,1;
ELSE
UPDATE Table1 SET count = count + 1 WHERE data = @data;
COMMIT TRANSACTION
If I wrap this in a TRANSACTION, it adds more overhead. I know TRY/CATCH also brings overhead, but I think a TRANSACTION adds more - does anyone know?
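Also, as far as I understand it, the transaction on its own doesn't actually close the race at the default READ COMMITTED isolation level, because the SELECT takes no lock that survives until the INSERT. What I've seen suggested (just a sketch, I haven't benchmarked it) is to add UPDLOCK, HOLDLOCK hints to the initial read so the key range stays locked until the transaction commits:

BEGIN TRANSACTION;

DECLARE @id INT;
-- UPDLOCK + HOLDLOCK hold a key-range lock from the check through to the commit,
-- so a concurrent caller blocks here instead of racing past the existence check
SET @id = (SELECT id FROM Table1 WITH (UPDLOCK, HOLDLOCK) WHERE data = @data);

IF (@id IS NULL)
    INSERT INTO Table1 (data, count) SELECT @data, 1;
ELSE
    UPDATE Table1 SET count = count + 1 WHERE data = @data;

COMMIT TRANSACTION;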
People keep telling me that using TRY/CATCH in normal application logic is BAD, but they won't tell me why.
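For what it's worth, if I do stay with TRY/CATCH, the variant I've been sketching (untested) only treats duplicate-key errors as the "row already exists" signal and re-raises everything else, so the CATCH isn't silently swallowing unrelated failures - as I recall, 2601 and 2627 are the duplicate unique index / unique constraint error numbers:

BEGIN TRY
    INSERT INTO Table1 (data, count) SELECT @data, 1;
END TRY
BEGIN CATCH
    -- 2601: duplicate key row in a unique index, 2627: unique constraint violation
    IF ERROR_NUMBER() IN (2601, 2627)
        UPDATE Table1 SET count = count + 1 WHERE data = @data;
    ELSE
        -- SQL Server 2005 has no THROW, so re-raise with RAISERROR
        RAISERROR('Unexpected error during upsert', 16, 1);
END CATCH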
Note: I'm running SQL Server 2005 on at least one box, so I can't use MERGE.