
I want to insert data received from a WebSocket into a database, but the inserts are lagging behind. I'm using WebSocketSharp, and the data arrives in JSON format. The `WebS.OnMessage += (s1, e1) =>` handler fires roughly 500 to 550 times per second, one JSON line per message, and each line needs to be inserted in under a second, ideally within milliseconds.

Sample data:

{"seg":"BSE","currc":"0.012","code":"538610","ltt":"1683171716"}
WebSocket WebS =new WebSocket("SocketLinkHere");
WebS.OnOpen+=(s1,e1)=>
{
//no need
};
WebS.OnError +=(s1,e1)=>
{
    if(!WebS.IsAlive)
    {
        WebS.Close();
        WebS.Connect();
    }
};
WebS.OnClose +=(s1,e1)=>
{
    //errorlist here
};
WebS.OnMessage +=(s1,e1)=>
{
    try
    {
        string result="["+e1.Data+"]";
        SQLEXE(result);//call method to insert result into database
    }
    catch(Exception ex){ /* swallowed; should at least be logged */ }
};

WebS.Connect();


private void SQLEXE(string result)
{
    try
    {
        DataTable dt=JsonConvert.DeserializeObject<DataTable>(result);
        SqlParameter[] para={
        new SqlParameter("@ltt",dt.Rows[0]["ltt"]),
        new SqlParameter("@code",dt.Rows[0]["code"]),
        new SqlParameter("@currc",dt.Rows[0]["currc"]),
        new SqlParameter("@seg",dt.Rows[0]["seg"])
        };
        cmd.Parameters.Clear();
        cmd.Parameters.AddRange(para);
        cmd.ExecuteNonQuery();
    }
    catch(Exception error){}
}
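As a first pass at the `SQLEXE` hot path: deserializing every single-row message into a `DataTable` is relatively heavy, and untyped `SqlParameter`s can defeat plan caching, so each call may compile a new plan. Below is a minimal sketch of a typed alternative; the `Tick` class name and the `SqlDbType` choices and lengths are assumptions, not the actual schema, and `cmd` is the same command field used above:

```csharp
using System.Data;
using System.Data.SqlClient;
using Newtonsoft.Json;

// Assumed shape of one message; property names match the JSON keys.
class Tick
{
    public string seg { get; set; }
    public decimal currc { get; set; }
    public string code { get; set; }
    public long ltt { get; set; }
}

// Sketch of a replacement for SQLEXE: typed deserialization plus
// explicitly typed parameters so SQL Server can reuse one cached plan.
private void SQLEXE(string json)
{
    // A single JSON object needs no "[" + ... + "]" array wrapper.
    Tick t = JsonConvert.DeserializeObject<Tick>(json);

    cmd.Parameters.Clear();
    // Types and lengths below are guesses; they must match the real columns
    // (or stored-procedure parameters) to get stable plan reuse.
    cmd.Parameters.Add("@ltt", SqlDbType.BigInt).Value = t.ltt;
    cmd.Parameters.Add("@code", SqlDbType.VarChar, 20).Value = t.code;
    cmd.Parameters.Add("@currc", SqlDbType.Decimal).Value = t.currc;
    cmd.Parameters.Add("@seg", SqlDbType.VarChar, 10).Value = t.seg;
    cmd.ExecuteNonQuery();
}
```

With explicitly typed parameters every call reuses the same compiled plan, which avoids per-message query compilation and plan-cache churn.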

I need that data inserted into the database in under a second. Table structure:

    ~500 per second, means about 2ms per insert. Which isn't bad perf given there will be a round trip overhead to the database. Consider batching inserts to reduce overheads. – Richard May 04 '23 at 07:03
  • How much milliseconds does that executeNonQuery call take? – rene May 04 '23 at 07:32
  • When we start it works properly; after a minute it begins to slow down, and the delay grows to 10 or 15 minutes. – Gaurav Mali May 04 '23 at 07:35
  • Have you tried to specify the correct data types and lengths on the SqlParameters to match the data types and lengths of the related columns in the table, or the parameters of the stored procedure you're calling? If you don't then it can create a new query plan for each request (and query compilation takes time) and eventually blow out the plan cache because you won't be getting any plan reuse. – AlwaysLearning May 04 '23 at 07:56
  • There are minor problems in this code but none of it explain delays like that. We need to know what that query does, what the tables and columns look like, what indexes are present and what the query plan shows for whatever happens on the SQL Server in that ExecuteNonQuery call. – rene May 04 '23 at 08:02
  • Check the memory usage on your app, maybe it's getting delayed by gc? as someone else suggested, i would probably run the inserts in separate thread, batched by some number of rows. since otherwise the overhead is quite large. – siggemannen May 04 '23 at 08:38
  • @siggemannen I also tried running SQLEXE on a different thread, but hit some errors at runtime. – Gaurav Mali May 04 '23 at 08:45
  • @rene There is a stored procedure that contains only the insert query. – Gaurav Mali May 04 '23 at 08:47
  • when you get such delays, what do you see in SSMS Activity monitor? you can click on query and select query plan to see if something is strange – siggemannen May 04 '23 at 08:48
  • so it inserts in a single table? No indexes? No triggers? Any other readers / writers for that table? What are the columns for that table? Where is that sql server on the network? Is it close to you or at the other end of the world? Show the insert statement just to be sure you're not overlooking something. – rene May 04 '23 at 08:53
  • @siggemannen OK, but is it possible to insert 500 rows one by one within milliseconds? – Gaurav Mali May 04 '23 at 08:53
  • Given you're I/O bound, it depends. – rene May 04 '23 at 08:54
  • @rene The SQL Server is set up on the local LAN. – Gaurav Mali May 04 '23 at 08:56
  • I'm still interested in the answers on my other comment. – rene May 04 '23 at 09:00
  • It's tough, but possible; I don't think it's doable with non-bulk techniques, though. We do around 20 million rows per day, which is around 50% of your kind of speed, but with far more columns and without problems. – siggemannen May 04 '23 at 09:01
  • @rene I added Table Structure in post please check – Gaurav Mali May 04 '23 at 09:03
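Following the batching and bulk-insert suggestions in the comments above, one common pattern is a producer/consumer writer: `OnMessage` only enqueues the parsed row, and a dedicated thread drains the queue and flushes batches with `SqlBulkCopy`, so the socket handler never waits on the database. A minimal sketch; the `TickData` table name, column order, and batch limits below are assumptions:

```csharp
using System;
using System.Collections.Concurrent;
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

class TickWriter
{
    private readonly BlockingCollection<object[]> _queue = new BlockingCollection<object[]>();
    private readonly string _connStr;

    public TickWriter(string connStr)
    {
        _connStr = connStr;
        Task.Factory.StartNew(WriterLoop, TaskCreationOptions.LongRunning);
    }

    // Called from WebS.OnMessage: cheap and non-blocking.
    public void Enqueue(long ltt, string code, decimal currc, string seg)
        => _queue.Add(new object[] { ltt, code, currc, seg });

    private void WriterLoop()
    {
        // Column order must match the destination table (assumed "TickData"),
        // since SqlBulkCopy maps by ordinal unless mappings are specified.
        var table = new DataTable();
        table.Columns.Add("ltt", typeof(long));
        table.Columns.Add("code", typeof(string));
        table.Columns.Add("currc", typeof(decimal));
        table.Columns.Add("seg", typeof(string));

        while (true)
        {
            // Block for the first row, then drain whatever else is queued,
            // up to 1000 rows or ~250 ms, into one batch.
            table.Rows.Add(_queue.Take());
            object[] row;
            var deadline = DateTime.UtcNow.AddMilliseconds(250);
            while (table.Rows.Count < 1000 &&
                   _queue.TryTake(out row, Math.Max(0, (int)(deadline - DateTime.UtcNow).TotalMilliseconds)))
                table.Rows.Add(row);

            using (var conn = new SqlConnection(_connStr))
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "TickData" })
            {
                conn.Open();
                bulk.WriteToServer(table); // one round trip per batch
            }
            table.Clear();
        }
    }
}
```

In `OnMessage` the `SQLEXE` call would be replaced with a single `writer.Enqueue(...)`; batching a few hundred rows per round trip amortizes the network and transaction overhead that dominates per-row inserts.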

0 Answers