
From Visual Studio 2015, I am running insert queries against SQL Server from a console application. The application reads from multiple files and inserts close to 200,000 records in a single run. The connection is closed and reopened for each new folder of files, and each folder provides roughly 50,000 entries. I have set myconn.CommandTimeout = 0 so the command keeps running, and the database's Auto Close property is set to off. Both the application and SQL Server are on the same machine. Partway through the run, it says

Db "my db name" log is not available

and then the query stops. How can I insert this many records without interruption? Please let me know if I have missed some information.
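Since the actual code is not posted, here is a minimal sketch of the insert pattern described above (not the real program); the connection string, root path, table, and column names are all hypothetical:

using System;
using System.Data.SqlClient;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        // Hypothetical root folder containing one subfolder per batch of files.
        foreach (var folder in Directory.GetDirectories(@"C:\data"))
        {
            // A new connection is opened per folder and closed afterwards,
            // as described in the question.
            using (var conn = new SqlConnection(
                "Server=.;Database=paklaw;Integrated Security=true"))
            {
                conn.Open();
                foreach (var file in Directory.GetFiles(folder))
                {
                    foreach (var line in File.ReadLines(file))
                    {
                        // One INSERT per record, matching the described behavior.
                        using (var cmd = new SqlCommand(
                            "INSERT INTO Records (Line) VALUES (@line)", conn))
                        {
                            cmd.CommandTimeout = 0; // no timeout, per the question
                            cmd.Parameters.AddWithValue("@line", line);
                            cmd.ExecuteNonQuery();
                        }
                    }
                }
            }
        }
    }
}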

Original Error Output:

Unhandled Exception: System.Data.SqlClient.SqlException: The log for database 'paklaw' is not available. Check the event log for related error messages. Resolve any errors and restart the database. A severe error occurred on the current command. The results, if any, should be discarded.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, SqlDataReader ds, Boolean describeParameterEncryptionRequest)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean asyncWrite)
   at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1 completion, String methodName, Boolean sendToPipe, Int32 timeout, Boolean asyncWrite)
   at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
   at filereadproj.Program.sqlfunc(SqlConnection conn, Int32 bi, String br, Int32 sI, String sV, Int32 s2I, String s2V, Int32 tsI, String tsV, String fn, String of, Boolean ke) in C:\Users\Saif-ul-Islam\Documents\Visual Studio 2015\Projects\filereadproj\filereadproj\Program.cs:line 730
   at filereadproj.Program.Main(String[] args) in C:\Users\Saif-ul-Islam\Documents\Visual Studio 2015\Projects\filereadproj\filereadproj\Program.cs:line 139

abdul qayyum
  • I don't think this has anything to do with inserting many rows. It sounds to me like your log file is corrupt, or your server has a failing disk, so you should check that. Also, when inserting a lot of records, see if you can use the SqlBulkCopy class (https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy(v=vs.110).aspx) to speed things up (a sketch of this approach follows these comments). – David Libido Dec 15 '16 at 21:05
  • The log file is OK. One thing I noticed is that if I execute the program on a low number of files it works out well, but that is time consuming. I am inserting file by file, and there are not many records in a file, so is the bulk class the best option to use? – abdul qayyum Dec 15 '16 at 21:23
  • 1
    If the log file is ok I'm not sure using SqlBulkCopy will resolve your issue. SqlBulkCopy is the fastest way to perform lots of inserts, so could also change your code to read multiple files and perform one (or more) bulk inserts, if possible. – David Libido Dec 15 '16 at 21:31
  • This may seem nonsensical, but it could fix the issue. After every file (or folder) has been inserted through BULK INSERT, try issuing a CHECKPOINT or CHECKPOINT 1 command. This commits things to disk and can stabilize performance on large bulk insert runs (this info comes from experience, not documentation). – Laughing Vergil Dec 16 '16 at 20:31
  • @LaughingVergil I have solved the problem by 1) splitting the database, 2) increasing the initial size of the log file, and 3) inserting using bulk insert as suggested by David Libido, without knowing which of these solved the issue. I don't know if CHECKPOINT will help, but I will try it and see the behavior. – abdul qayyum Dec 20 '16 at 20:15
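For reference, here is a minimal sketch combining the two suggestions from the comments above: staging each file in a DataTable, writing it with SqlBulkCopy, and issuing a CHECKPOINT after each folder. The destination table "Records" and its "Line" column are hypothetical, not from the original code:

using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoader
{
    static void LoadFolder(string folder, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            foreach (var file in Directory.GetFiles(folder))
            {
                // Stage one file's rows in memory.
                var table = new DataTable();
                table.Columns.Add("Line", typeof(string));
                foreach (var line in File.ReadLines(file))
                    table.Rows.Add(line);

                // One bulk insert per file instead of one INSERT per record.
                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "Records";
                    bulk.ColumnMappings.Add("Line", "Line");
                    bulk.BulkCopyTimeout = 0; // no timeout, matching the question
                    bulk.WriteToServer(table);
                }
            }

            // Per Laughing Vergil's suggestion: flush dirty pages to disk
            // after each folder to steady performance during long runs.
            using (var cmd = new SqlCommand("CHECKPOINT", conn))
                cmd.ExecuteNonQuery();
        }
    }
}

Bulk copy reduces round trips and log pressure compared with one ExecuteNonQuery per record, which is why the comments recommend it for runs of this size.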

0 Answers