
I use SqlBulkCopy to insert a large number of entries into our logging database.

The layout of the program is:

It receives a stream of data from the network (other servers), parses the stream, and builds up Log objects, around 200-400 per second. I then add each log to a DataTable.

I then increment a counter. Once I have 10,000 logs I do the bulk insert with SqlBulkCopy.

Now the issue I am having is that if one of the rows fails SQL validation (e.g. one field is too long), I lose all the remaining logs in the batch as well.

Is there a way to validate each Log item against the DataTable as I add it, so that I can skip the invalid ones and keep all the valid ones safe?

Currently I am inserting one item at a time, and if it fails I ignore it and carry on with the next. But this obviously defeats the point and the performance benefits of SqlBulkCopy.

Some Code:

    private DataTable _logTable;

    public void AddLog(Log log)
    {
        if (log.serverId != null && log.serverId > 1)
        {
            try
            {
                _logTable.Rows.Add(log.logId, log.messageId, log.serverId, log.time, log.direction, log.hasRouting,
                    log.selfRouting, log.deviceType, log.unitId, log.accountCode, log.clientId, log.data);

                // Flush the buffered rows to SQL Server once the batch size is reached.
                if (_logBufferCounter++ > BufferValue)
                {
                    _logBufferCounter = 0;
                    using (var sbc = new SqlBulkCopy(_connectionString, SqlBulkCopyOptions.TableLock))
                    {
                        sbc.DestinationTableName = "dbo.Logs";
                        sbc.BulkCopyTimeout = 0;
                        sbc.WriteToServer(_logTable);
                        _logTable.Clear();
                    }
                }
            }
            catch (Exception e)
            {
                // A single invalid row fails the whole batch, so everything buffered so far is lost here.
                Log.Error("Failed to write bulk insert for LOG Table", e);
                _logTable.Clear();
            }
        }
        else
        {
            Log.Error("Server Id is null or invalid for LOG: " + LogToString(log));
        }
    }
Zapnologica
  • Why don't you validate the fields while creating the Log object? It should be correct by the time it's in the SqlBulkCopy. – slugster Oct 15 '14 at 06:40

1 Answer


No, there is not.

But you can, as a programmer, do validation before inserting. Not exactly that hard, you know. And there is no need to have a HEAVY DataTable at all - use normal objects and hammer them into the SqlBulkCopy instance using your own implementation of the required interface (IDataReader) ;)
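
For the "field is too long" case from the question, here is a minimal sketch of that up-front check (not from the original post: the TryAddRow helper is illustrative, and it assumes the DataTable columns were created with MaxLength and AllowDBNull set to match the dbo.Logs schema):

    // Validate each row against the DataTable's own column metadata before adding it,
    // so one oversized or null field only costs that row rather than the whole batch.
    // Requires: using System; using System.Data;
    private bool TryAddRow(DataTable table, params object[] values)
    {
        for (int i = 0; i < table.Columns.Count && i < values.Length; i++)
        {
            DataColumn column = table.Columns[i];
            object value = values[i];

            // Reject nulls destined for non-nullable columns.
            if (value == null || value == DBNull.Value)
            {
                if (!column.AllowDBNull)
                    return false;
                continue;
            }

            // Reject strings that exceed the declared column length (MaxLength is -1 when unlimited).
            if (column.DataType == typeof(string) && column.MaxLength > 0 &&
                ((string)value).Length > column.MaxLength)
            {
                return false;
            }
        }

        table.Rows.Add(values);
        return true;
    }

AddLog could then call TryAddRow with the same values it currently passes to Rows.Add and log-and-skip when it returns false, instead of letting WriteToServer throw and discard the batch. Constraints that live in the database (foreign keys, duplicate primary keys) can't be checked this way without a round trip, but length and nullability cover the failure described above. And if you drop the DataTable as suggested, note that WriteToServer also accepts an IDataReader; third-party libraries such as FastMember offer an ObjectReader that exposes a plain IEnumerable<T> of your Log objects through that interface.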

TomTom
  • I hear what you are saying. Could you elaborate on the DataTable being "HEAVY"? Is it bad practice to make use of DataTables? Yes, I agree I could implement validation in code before inserting; however, I'm a fan of keeping validation rules in one place. But due to the performance needs of this application, I think your suggestion of simply validating it in code would be the way to go. – Zapnologica Oct 15 '14 at 06:31
  • Also, just an afterthought: how do I validate things such as foreign key constraints, duplicate PK errors, etc. in my code without querying the database? – Zapnologica Oct 15 '14 at 08:35