C# How to insert bulk of data using List.AddRange more efficiently

I am trying to insert data from a third-party REST API (Sage) directly into Azure SQL using C# code.

While trying to add about 30,000 rows of data, I get the error below at line 66, which contains

resultList.AddRange(result.Data);

So far I have only tried with three columns of data, and I cannot add more columns.

Currently, I am using the 'Basic' tier of Azure SQL Database.

I am not sure what I could improve in the C# code to make it more efficient (for example, a better way to add items to the List).

Or should I just upgrade to an Azure SQL Database tier with more DTUs?

Or should I not convert (serialize) the JSON for Azure SQL at all, but instead insert it in JSON format into Cosmos DB first and move it to Azure SQL later?

Below is the error message:

System.Data.SqlClient.SqlException
HResult=0x80131904
Message=Timeout expired.  The timeout period elapsed prior to completion of the operation or 
the server is not responding.
Source=Core .Net SqlClient Data Provider
StackTrace:
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean 
breakConnection, Action`1 wrapCloseInAction)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, 
Boolean callerHasConnectionLock, Boolean asyncClose)
at System.Data.SqlClient.TdsParserStateObject.ThrowExceptionAndWarning(Boolean 
callerHasConnectionLock, Boolean asyncClose)
at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, 
UInt32 error)
at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, 
SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject 
stateObj, Boolean& dataReady)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)
at Sage0413.Program.Main(String[] args) in C:\Users\source\repos\Sage\Main.cs:line 66

This is the part of the C# code that has List.AddRange:

List<dynamic> resultList = new List<dynamic>();
do
{
    query.Offset = offset;
    query.PageSize = pagesize;

    Task<OnlineResponse> task = client.Execute(query);
    OnlineResponse response = task.Result;
    Result result = response.Results[0];

    offset += pagesize;               // advance to the next page
    resultList.AddRange(result.Data); // accumulate this page's rows

} while (offset < 30000);
resultJsonString = JsonConvert.SerializeObject(resultList);
return resultJsonString;      

This is the part of the C# code that has SqlBulkCopy:

foreach (var record in data)
{
    table.Rows.Add(new object[] { record.Detail.RecordNo, record.Detail.BatchKey,
        record.Detail.WhenModified });
}

using (var conn = new SqlConnection(connString))
using (var bulk = new SqlBulkCopy(conn))
{
    bulk.DestinationTableName = "tblSage";
    conn.Open();
    bulk.WriteToServer(table);
} 

CodePudding user response:

You need to increase the bulk-copy timeout so the operation is allowed more time than the default (30 seconds).

bulk.BulkCopyTimeout = 60; // seconds
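
For context, this is how the setting fits into the SqlBulkCopy block from your question (a sketch; 60 seconds is just an example value, and a value of 0 means no time limit):

using (var conn = new SqlConnection(connString))
using (var bulk = new SqlBulkCopy(conn))
{
    bulk.DestinationTableName = "tblSage";
    bulk.BulkCopyTimeout = 60;   // seconds; default is 30, 0 waits indefinitely
    conn.Open();
    bulk.WriteToServer(table);
}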
