
I'm writing a trading bot in Ruby, and I need to constantly run calculations based on an exchange's orderbook depth data across several daemons (daemons gem).

The problem is that right now I'm fetching the data via the exchange's API separately in each daemon, so I've run into the API call limit (40 requests/second). That's why I'm trying to use Ruby's DRb to share the orderbook data across the processes (daemons) and avoid sending unnecessary API calls.

However, I'm not sure how to continuously consume the API on the server side and provide the latest data to the client processes. With the sample code below, the client only gets the data that was current at the moment the server started.

server_daemon.rb

require 'drb'
exchange = Exchange.new(api_key: ENV['APIKEY'], secret_key: ENV['SECRET'])
# Fetched once at startup; this snapshot is never updated afterwards.
shared_orderbook = exchange.orderbook(limit: 50)
DRb.start_service('druby://127.0.0.1:61676', shared_orderbook)
puts 'Listening for connection… '
DRb.thread.join

client_daemon.rb

require 'drb'
DRb.start_service
shared_data = DRbObject.new_with_uri('druby://127.0.0.1:61676')
puts shared_data
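
What I think I need is for the server to keep polling the exchange in the background and always hand clients the current snapshot, rather than the one fetched at startup. Here's a minimal sketch of that idea; Exchange is a placeholder for my actual API client, and OrderbookCache, snapshot, and the 0.5 s refresh interval are names/values I made up:

require 'drb'

exchange = Exchange.new(api_key: ENV['APIKEY'], secret_key: ENV['SECRET'])

# Holds the most recent orderbook; a background thread keeps it fresh.
class OrderbookCache
  def initialize(exchange, interval: 0.5)
    @mutex = Mutex.new
    @data  = exchange.orderbook(limit: 50)
    Thread.new do
      loop do
        fresh = exchange.orderbook(limit: 50)
        @mutex.synchronize { @data = fresh }
        sleep interval   # ~2 requests/second, far under the 40 req/s limit
      end
    end
  end

  # Called remotely over DRb; the hash is marshaled back to the caller.
  def snapshot
    @mutex.synchronize { @data }
  end
end

DRb.start_service('druby://127.0.0.1:61676', OrderbookCache.new(exchange))
puts 'Listening for connection… '
DRb.thread.join

The client would then call snapshot each time it needs fresh data:

require 'drb'
DRb.start_service
orderbook = DRbObject.new_with_uri('druby://127.0.0.1:61676')
puts orderbook.snapshot   # latest data, not the startup snapshot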
  • Instead of drb, I'd use redis and its pub/sub (or similar solution). One daemon consumes the api and pushes updates to a pubsub channel. Other daemons listen to that channel, get updates and act accordingly. – Sergio Tulentsev Apr 25 '15 at 16:17
  • @SergioTulentsev thanks, I'll give it a try. Do you know any performance implications of using Redis pub/sub? – Ilya Cherevkov Apr 25 '15 at 16:35
  • Redis is crazy fast. If you manage to not screw it up, your app will be fast as well. :) By that I mean, for example, pushing 3 kilobytes of serialized ruby object when you only need 50 bytes of data. Don't know what kind of data/objects you'll be working with, but hope you get the idea. – Sergio Tulentsev Apr 25 '15 at 16:39
  • I only need 100-150 items in a key/value hash every 10-20 ms. – Ilya Cherevkov Apr 25 '15 at 16:54
  • This should be fine. – Sergio Tulentsev Apr 25 '15 at 17:09
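
For reference, here's a minimal sketch of the pub/sub setup Sergio describes, assuming the redis gem; the channel name 'orderbook', the JSON payload format, and the Exchange client are placeholders of mine:

publisher_daemon.rb (the only process that talks to the exchange API)

require 'redis'
require 'json'

redis = Redis.new
exchange = Exchange.new(api_key: ENV['APIKEY'], secret_key: ENV['SECRET'])

loop do
  book = exchange.orderbook(limit: 50)
  # Publish compact JSON rather than a marshaled Ruby object,
  # per the advice above about keeping payloads small.
  redis.publish('orderbook', JSON.generate(book))
  sleep 0.02  # roughly every 20 ms
end

subscriber_daemon.rb (each calculating daemon runs one of these)

require 'redis'
require 'json'

Redis.new.subscribe('orderbook') do |on|
  on.message do |_channel, payload|
    book = JSON.parse(payload)
    # ...run calculations on the latest depth data...
  end
end

Note that subscribe blocks its Redis connection, so a subscriber that also issues regular Redis commands needs a second connection for those.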
