
I am trying to insert data from a third-party (Sage) REST API into Azure SQL directly using C# code.

While trying to insert about 30,000 rows of data, I get the error below at line 66, which contains:

resultList.AddRange(result.Data);

I have only tried this with three columns of data, and I cannot add more columns.

Currently, I am using the 'Basic' tier of Azure SQL Database.

I am not sure what I could improve in the C# code to make it more efficient (for example, a better way to build the list).

Or should I just upgrade to an Azure SQL Database tier with more DTUs?

Or should I not convert (serialize) the JSON for Azure SQL, but instead insert it in JSON format into Cosmos DB first and move it to Azure SQL later?

Here is the error message:

System.Data.SqlClient.SqlException
HResult=0x80131904
Message=Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Source=Core .Net SqlClient Data Provider
StackTrace:
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParserStateObject.ThrowExceptionAndWarning(Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
   at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
   at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
   at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)
   at Sage0413.Program.Main(String[] args) in C:\Users\source\repos\Sage\Main.cs:line 66

This is the part of the C# code that has List.AddRange:

List<dynamic> resultList = new List<dynamic>();
do
{
    query.Offset = offset;
    query.PageSize = pagesize;

    Task<OnlineResponse> task = client.Execute(query);
    OnlineResponse response = task.Result;
    Result result = response.Results[0];

    offset += pagesize;
    resultList.AddRange(result.Data);

} while (offset < 30000);
resultJsonString = JsonConvert.SerializeObject(resultList);
return resultJsonString;      

This is the part of the C# code that uses SqlBulkCopy:

foreach (var record in data)
{
    table.Rows.Add(new object[] { record.Detail.RecordNo, record.Detail.BatchKey, record.Detail.WhenModified });
}

using (var conn = new SqlConnection(connString))
using (var bulk = new SqlBulkCopy(conn))
{
    bulk.DestinationTableName = "tblSage";
    conn.Open();
    bulk.WriteToServer(table);
} 
You cannot get more efficiency than the database's maximum assigned resources, unless someone else has some magic that I am not aware of – Maytham Fahmi May 23 '22 at 03:59

1 Answer


You need to adjust the timeout value to allow more time than the default.

bulk.BulkCopyTimeout = 60; // seconds
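
In context, a minimal sketch of the question's bulk-copy block with the timeout raised (the table name and `connString` come from the question; the 300-second timeout and the 5,000-row batch size are illustrative values, not recommendations):

```csharp
using (var conn = new SqlConnection(connString))
using (var bulk = new SqlBulkCopy(conn))
{
    bulk.DestinationTableName = "tblSage";

    // Raise the per-operation timeout from the 30-second default.
    // A value of 0 means "wait indefinitely"; a finite value is usually safer.
    bulk.BulkCopyTimeout = 300; // seconds

    // Optionally send rows in smaller batches so each round trip
    // stays well within the timeout on a low-DTU tier.
    bulk.BatchSize = 5000;

    conn.Open();
    bulk.WriteToServer(table);
}
```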


  • Is there any issue if I increase it to more than 60 seconds? What is the ideal duration and what are its implications? – Java May 23 '22 at 17:08
  • No. The timeout is 30 seconds by default, which is clearly not enough for your case. You can set it as long as necessary for your scenario. But keep in mind that if your database doesn't respond, it will take until the end of the timeout to find out it is down, so you also don't want to make the timeout too long. – NightOwl888 May 23 '22 at 17:16
  • What is the implication if the timeout is too long? I am trying to apply the maximum. – Java May 23 '22 at 23:42
  • Well, there is a connection timeout that can be very short, since opening a connection is quick. So, the only ways a `BulkCopyTimeout` can occur are if the database or network go down while connected, which should be very rare. The timeout is simply how long the client waits for the server to return. So, you can be fairly liberal if you are dealing with a reliable server and network connection. Azure no doubt has backup UPS and generators, redundant hardware and networks, so an outage isn't something you have to worry about. – NightOwl888 May 24 '22 at 05:17