
I'm building a candle recorder for Binance crypto markets. I'm interested in 1-minute candles, including intra-candle data, for market study purposes (though eventually I could use this same code to be my eyes on what's happening in the market).

To avoid potential lag and EF / SQL performance issues, I decided to accomplish this using two threads.

One receives the subscribed (async) updates from Binance and puts them in a ConcurrentQueue, while the other keeps trying to dequeue and save the data to MSSQL.

My question concerns the second thread, a while(true) loop. What's the best approach to save roughly 200+ records/sec to SQL with EF, given that the records arrive individually (sometimes 300 records within 300 ms, sometimes fewer)?

Should I open the SQL connection each time I want to save (performance-wise)? What's the best approach to accomplish this?

-- EDITED -- At one point I had 600k+ items in the queue, so I'm having problems inserting into SQL fast enough. I changed from LINQ to SQL to EF.

Here's my current code:

//Initialize 
        public void getCoinsMoves()
        {
            Thread THTransferDatatoSQL = new Thread(TransferDatatoSQL);
            THTransferDatatoSQL.Name = "THTransferDatatoSQL";
            THTransferDatatoSQL.SetApartmentState(ApartmentState.STA);
            THTransferDatatoSQL.IsBackground = true;
            THTransferDatatoSQL.Start();

            List<string> SymbolsMap;
            using(DBBINANCEEntities lSQLBINANCE = new DBBINANCEEntities())
            {
                SymbolsMap = lSQLBINANCE.TB_SYMBOLS_MAP.Select(h => h.SYMBOL).ToList();
            }

            socketClient.Spot.SubscribeToKlineUpdatesAsync(SymbolsMap, Binance.Net.Enums.KlineInterval.OneMinute, h =>
            {
                RecordCandles(h);
            });
        }

//Enqueue Data
        public void RecordCandles(Binance.Net.Interfaces.IBinanceStreamKlineData Candle)
        {
            FRACTIONED_CANDLES.Enqueue(new TB_FRACTIONED_CANDLES_DATA()
            {
                BASE_VOLUME = Candle.Data.BaseVolume,
                CLOSE_TIME = Candle.Data.CloseTime.AddHours(-3),
                MONEY_VOLUME = Candle.Data.QuoteVolume,
                PCLOSE = Candle.Data.Close,
                PHIGH = Candle.Data.High,
                PLOW = Candle.Data.Low,
                POPEN = Candle.Data.Open,
                SYMBOL = Candle.Symbol,
                TAKER_BUY_BASE_VOLUME = Candle.Data.TakerBuyBaseVolume,
                TAKER_BUY_MONEY_VOLUME = Candle.Data.TakerBuyQuoteVolume,
                TRADES = Candle.Data.TradeCount,
                IS_LAST_CANDLE = Candle.Data.Final
            });
        }

//Transfer Data to SQL
        public void TransferDatatoSQL()
        {
            while (true)
            {
                TB_FRACTIONED_CANDLES_DATA NewData;
                if (FRACTIONED_CANDLES.TryDequeue(out NewData))
                {
                    using (DBBINANCEEntities LSQLBINANCE = new DBBINANCEEntities())
                    {
                        LSQLBINANCE.TB_FRACTIONED_CANDLES_DATA.Add(NewData);
                        if (NewData.IS_LAST_CANDLE)
                            LSQLBINANCE.TB_CANDLES_DATA.Add(new TB_CANDLES_DATA()
                            {
                                BASE_VOLUME = NewData.BASE_VOLUME,
                                CLOSE_TIME = NewData.CLOSE_TIME,
                                IS_LAST_CANDLE = NewData.IS_LAST_CANDLE,
                                MONEY_VOLUME = NewData.MONEY_VOLUME,
                                PCLOSE = NewData.PCLOSE,
                                PHIGH = NewData.PHIGH,
                                PLOW = NewData.PLOW,
                                POPEN = NewData.POPEN,
                                SYMBOL = NewData.SYMBOL,
                                TAKER_BUY_BASE_VOLUME = NewData.TAKER_BUY_BASE_VOLUME,
                                TAKER_BUY_MONEY_VOLUME = NewData.TAKER_BUY_MONEY_VOLUME,
                                TRADES = NewData.TRADES
                            });
                        LSQLBINANCE.SaveChanges();
                    }
                }
                Thread.Sleep(1);
            }            
        }

Thanks in advance,

Rafael

    Don't use LINQ to SQL; it's ancient and deprecated. Don't use threads; you'll come unstuck very quickly. Don't worry about opening and closing connections; it's not what you think (read up on what connection pooling is). Don't worry about performance of inserting 300 items a second until you've actually proven it's a problem... – Caius Jard Jan 24 '21 at 13:42
  • Hi Caius, thanks for the reply. I'm actually facing problems right now. At one point my queue got to 600k+ items, so the second thread is lagging behind on the insert operation. I'm running MSSQL on a regular i5 desktop with 16GB RAM. Any suggestions? – Rafael Rizolli Jan 24 '21 at 13:53
  • `InsertOnSubmit` is jolly slow. Use `SqlBulkCopy` instead and fire it off every few seconds (see the sketch after these comments). – Charlieface Jan 24 '21 at 14:32
  • I thought about doing that, but waiting a few seconds on a 1-minute candle (for live use, when and if I get to that) is like going blind. – Rafael Rizolli Jan 24 '21 at 14:46
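
For reference, here is a rough sketch of the SqlBulkCopy approach suggested in the comment above: the consumer thread drains whatever has accumulated, writes it in one bulk operation, and repeats every couple of seconds. The destination table name, column names, and CLR types are assumptions based on the entity in the question (TB_FRACTIONED_CANDLES_DATA), and the connection string handling is a placeholder; it needs System.Data, System.Data.SqlClient, System.Threading and System.Collections.Concurrent.

// Sketch only: batch the queued candles and write them with SqlBulkCopy every few
// seconds instead of issuing one INSERT per row. Table/column names and types are
// assumptions taken from the entity in the question; adjust to the real schema.
public void TransferDataToSqlBulk(ConcurrentQueue<TB_FRACTIONED_CANDLES_DATA> queue, string connectionString)
{
    while (true)
    {
        // Drain whatever has accumulated since the last flush.
        DataTable batch = BuildCandleTable();
        TB_FRACTIONED_CANDLES_DATA candle;
        while (queue.TryDequeue(out candle))
        {
            batch.Rows.Add(candle.SYMBOL, candle.POPEN, candle.PHIGH, candle.PLOW, candle.PCLOSE,
                           candle.CLOSE_TIME, candle.BASE_VOLUME, candle.MONEY_VOLUME,
                           candle.TAKER_BUY_BASE_VOLUME, candle.TAKER_BUY_MONEY_VOLUME,
                           candle.TRADES, candle.IS_LAST_CANDLE);
        }

        if (batch.Rows.Count > 0)
        {
            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlBulkCopy bulk = new SqlBulkCopy(connection))
            {
                bulk.DestinationTableName = "TB_FRACTIONED_CANDLES_DATA";
                // Map by name so the column order in the destination table doesn't matter.
                foreach (DataColumn column in batch.Columns)
                    bulk.ColumnMappings.Add(column.ColumnName, column.ColumnName);

                connection.Open();
                bulk.WriteToServer(batch);   // one round trip for the whole batch
            }
        }

        Thread.Sleep(2000);                  // flush interval; tune to taste
    }
}

private static DataTable BuildCandleTable()
{
    DataTable table = new DataTable();
    table.Columns.Add("SYMBOL", typeof(string));
    table.Columns.Add("POPEN", typeof(decimal));
    table.Columns.Add("PHIGH", typeof(decimal));
    table.Columns.Add("PLOW", typeof(decimal));
    table.Columns.Add("PCLOSE", typeof(decimal));
    table.Columns.Add("CLOSE_TIME", typeof(DateTime));
    table.Columns.Add("BASE_VOLUME", typeof(decimal));
    table.Columns.Add("MONEY_VOLUME", typeof(decimal));
    table.Columns.Add("TAKER_BUY_BASE_VOLUME", typeof(decimal));
    table.Columns.Add("TAKER_BUY_MONEY_VOLUME", typeof(decimal));
    table.Columns.Add("TRADES", typeof(int));
    table.Columns.Add("IS_LAST_CANDLE", typeof(bool));
    return table;
}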

1 Answer


I see one error in your code: you're sleeping the background thread after every dequeue attempt, even when more data is waiting. Don't sleep if there's more data. Instead of:

if (FRACTIONED_CANDLES.TryDequeue(out NewData))
{
    using (DBBINANCEEntities LSQLBINANCE = new DBBINANCEEntities())
    {
        LSQLBINANCE.TB_FRACTIONED_CANDLES_DATA.Add(NewData);
        if (NewData.IS_LAST_CANDLE)
            LSQLBINANCE.TB_CANDLES_DATA.Add(new TB_CANDLES_DATA()
            {
                BASE_VOLUME = NewData.BASE_VOLUME,
                CLOSE_TIME = NewData.CLOSE_TIME,
                IS_LAST_CANDLE = NewData.IS_LAST_CANDLE,
                MONEY_VOLUME = NewData.MONEY_VOLUME,
                PCLOSE = NewData.PCLOSE,
                PHIGH = NewData.PHIGH,
                PLOW = NewData.PLOW,
                POPEN = NewData.POPEN,
                SYMBOL = NewData.SYMBOL,
                TAKER_BUY_BASE_VOLUME = NewData.TAKER_BUY_BASE_VOLUME,
                TAKER_BUY_MONEY_VOLUME = NewData.TAKER_BUY_MONEY_VOLUME,
                TRADES = NewData.TRADES
            });
        LSQLBINANCE.SaveChanges();
    }
}
Thread.Sleep(1);

Change the last line to:

else
    Thread.Sleep(1);

This may resolve your problem.

Lacutah
  • Also, this could potentially benefit from a more bare-metal bulk insert instead of inserting one record at a time, since you're not manipulating the inserted data and don't need to track it. Obviously there's little benefit if only one record is pending at a time, but it would still remove the EF object tracking. I use https://github.com/borisdj/EFCore.BulkExtensions for this sort of task. You could rewrite the background thread to dequeue ALL pending data and insert it in one statement each time, to get rid of (or at least alleviate) the overhead of EF entity tracking in your current approach (see the sketch after these comments). – Lacutah Jan 24 '21 at 21:12
  • That really helped, thanks! Changing to bulk inserts solved it. I just grab everything, whether that's 1 item or 1,000. – Rafael Rizolli Jan 25 '21 at 00:02
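
For reference, here is a sketch of the drain-everything-then-bulk-insert loop described in the comment above. It assumes DBBINANCEEntities is an EF Core DbContext and that the EFCore.BulkExtensions package linked in the comment is installed (its BulkInsert extension method); with classic EF, AddRange plus a single SaveChanges per batch would be the nearest equivalent.

// Sketch only: drain the whole queue each pass and insert it as one batch,
// using the BulkInsert extension from EFCore.BulkExtensions (assumes an
// EF Core DbContext). Sleep only when the queue is empty.
public void TransferDatatoSQL()
{
    while (true)
    {
        var batch = new List<TB_FRACTIONED_CANDLES_DATA>();
        TB_FRACTIONED_CANDLES_DATA candle;
        while (FRACTIONED_CANDLES.TryDequeue(out candle))
            batch.Add(candle);

        if (batch.Count > 0)
        {
            using (var db = new DBBINANCEEntities())
            {
                // One set-based insert per batch; no per-entity change tracking.
                db.BulkInsert(batch);

                // Closed candles also go to the 1-minute candle table.
                var closed = batch
                    .Where(c => c.IS_LAST_CANDLE)
                    .Select(c => new TB_CANDLES_DATA
                    {
                        BASE_VOLUME = c.BASE_VOLUME,
                        CLOSE_TIME = c.CLOSE_TIME,
                        IS_LAST_CANDLE = c.IS_LAST_CANDLE,
                        MONEY_VOLUME = c.MONEY_VOLUME,
                        PCLOSE = c.PCLOSE,
                        PHIGH = c.PHIGH,
                        PLOW = c.PLOW,
                        POPEN = c.POPEN,
                        SYMBOL = c.SYMBOL,
                        TAKER_BUY_BASE_VOLUME = c.TAKER_BUY_BASE_VOLUME,
                        TAKER_BUY_MONEY_VOLUME = c.TAKER_BUY_MONEY_VOLUME,
                        TRADES = c.TRADES
                    })
                    .ToList();

                if (closed.Count > 0)
                    db.BulkInsert(closed);
            }
        }
        else
        {
            Thread.Sleep(1); // only sleep when there is nothing to write
        }
    }
}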