Timeout issue:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. The statement has been terminated.
I have 17 million records to dump into my application database. These 12 million records are the result of a comparison operation between the records of 2 databases.
I compare the records of the 2 databases, add the mismatched records (based on some criteria) to a DataTable, and once that DataTable reaches some limit like 1000 or 500 rows, I send it to SqlBulkCopy for bulk import and then empty the DataTable.
I do this whole operation inside a transaction, so that if I have inserted X records and any error occurs during my comparison process, I can roll back those X records.
But because of this, I am getting a timeout issue during the bulk copy.
I have tried various batch sizes like 5000, 1000, 500, 300, etc. I get timeout issues with all of these batch sizes.
Once I set the bulk copy timeout to 0, but then I got this error instead:
The transaction log for my database is full.
With 1000 records per batch it reaches 2.7 million rows and then throws a timeout issue;
with 500 records it reached about 2.1 million rows and then threw an error;
with 300, 200, and 100 it also throws timeout errors.
I have also set the connection timeout in my connection string to 30 minutes.
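For reference, the log-full error can also be investigated on the server side before touching the client code. A quick check of log usage and the recovery model (assuming the database is named 'XYZ', as in the errors further down) looks like:

```sql
-- How large each database's log file is and how much of it is in use
DBCC SQLPERF(LOGSPACE);

-- FULL recovery keeps log records until a log backup runs;
-- SIMPLE lets the log truncate after each checkpoint
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'XYZ';
```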
Code:
public class SaveRepo : IDisposable
{
    DataTable dataTable;
    SqlConnection connection;
    string connectionString;
    SqlTransaction transaction;
    SqlBulkCopy bulkCopy;
    int testId;

    public SaveRepo(string connectionString, int testId) // testId = 10364
    {
        this.connectionString = connectionString;
        this.testId = testId;
        dataTable = new DataTable();
        connection = new SqlConnection(connectionString);
        connection.Open();
        transaction = connection.BeginTransaction();
        bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction);
        bulkCopy.BulkCopyTimeout = 60;
        bulkCopy.EnableStreaming = true;
        bulkCopy.DestinationTableName = "dbo.Sales";
        bulkCopy.BatchSize = 100;
        bulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsCopied);
        bulkCopy.NotifyAfter = 100;
    }

    void OnSqlRowsCopied(object sender, SqlRowsCopiedEventArgs e)
    {
        // progress notification raised every NotifyAfter rows
    }

    void Dump()
    {
        try
        {
            bulkCopy.WriteToServer(dataTable);
        }
        catch (Exception) // timeout error is thrown here
        {
            throw; // rethrow without resetting the stack trace
        }
    }

    void FillDatatable(object[] row)
    {
        if (dataTable.Rows.Count == 100)
        {
            Dump();
            dataTable.Clear();
        }
        dataTable.Rows.Add(row);
    }

    public void End()
    {
        if (dataTable.Rows.Count > 0) // flush the last partial batch
        {
            Dump();
            dataTable.Clear();
        }
        transaction.Commit();
    }

    public void Dispose()
    {
        bulkCopy?.Close();
        transaction?.Dispose();
        connection?.Dispose();
    }
}
Is there any other way or solution that I am missing which could solve this timeout issue?
Update: After setting BulkCopyTimeout to 0 and using BatchSize = 1000, I got the error below once 3,593,000 records had been bulk copied:
Could not allocate space for object 'dbo.Sales'.'PK_dbo.Sales' in database 'XYZ' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
Update 2: I removed the transaction, and I now open and close the connection for each batch; if an error occurs while dumping a batch, I remove all the previously saved data using testId. This works up to dumping about 3 million rows, and then I get this error:
Could not allocate space for object 'dbo.Sales'.'PK_dbo.Sales' in database 'XYZ' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
This goes to the catch section, where I try to remove the old data based on testId, but that takes so long that it then throws this error:
The transaction log for my database is full.
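The filegroup-full message itself suggests checking the data files' sizes and autogrowth settings. A diagnostic sketch (the logical file name 'XYZ_Data' in the ALTER is a placeholder; the real name comes from the first query):

```sql
USE XYZ;

-- Size per file (size is in 8 KB pages; /128 converts to MB), cap, and growth settings
SELECT name, type_desc, size / 128 AS size_mb, max_size, growth, is_percent_growth
FROM sys.database_files;

-- Example only: turn on autogrowth for a data file
-- ('XYZ_Data' is a placeholder -- use the logical name reported above)
ALTER DATABASE XYZ
MODIFY FILE (NAME = 'XYZ_Data', FILEGROWTH = 256MB);
```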
void Dump()
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.Sales";
            bulkCopy.EnableStreaming = true;
            try
            {
                bulkCopy.WriteToServer(dataTable);
            }
            catch (Exception)
            {
                // remove the rows already saved for this run, then surface the error
                SalesRepo.Delete(connectionString, testId);
                throw;
            }
        }
    }
}
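For reference, the direction I am now experimenting with: giving each batch its own short-lived transaction, so a failure only rolls back that batch and the log can truncate between batches (under SIMPLE recovery), while testId is used to clean up batches that already committed. This is only a sketch of the idea, not tested code:

```csharp
void DumpBatch(DataTable batch)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        // Short-lived transaction per batch: a failure only rolls back this batch;
        // batches committed earlier can still be removed via their testId column.
        using (SqlTransaction transaction = connection.BeginTransaction())
        using (SqlBulkCopy bulkCopy =
            new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
        {
            bulkCopy.DestinationTableName = "dbo.Sales";
            bulkCopy.BulkCopyTimeout = 0;   // no per-batch timeout
            bulkCopy.EnableStreaming = true;
            try
            {
                bulkCopy.WriteToServer(batch);
                transaction.Commit();
            }
            catch
            {
                transaction.Rollback();
                throw; // caller deletes earlier batches by testId, then reports the error
            }
        }
    }
}
```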