
I've been planning out a Rails RSS aggregator lately, and I ran into something I could use some advice on. The part that handles polling and parsing of users' subscribed feeds needs to run constantly, and I assume a daemon is probably the best option for that. (I would use the Daemons gem and have the daemon periodically query the database for feeds in need of refreshing, then use Feedzirra to parse and save items.)
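For the record, here's a rough sketch of what I have in mind. The Feed and FeedItem models, their columns, and the refresh interval are just placeholder names for illustration, and it assumes the daemon can somehow load the Rails models, which is exactly the part I'm unsure about:

```ruby
#!/usr/bin/env ruby
require 'daemons'
require 'feedzirra'

Daemons.daemonize   # detach this script and keep it running in the background

# ...somehow load the Rails environment (or at least ActiveRecord and the
# models) here -- this is the sharing problem described above.

loop do
  # Hypothetical Feed model with feed_url / last_fetched_at columns
  Feed.where('last_fetched_at < ?', 15.minutes.ago).find_each do |feed|
    parsed = Feedzirra::Feed.fetch_and_parse(feed.feed_url)
    next unless parsed.respond_to?(:entries)   # fetch_and_parse returns a status code on failure

    parsed.entries.each do |entry|
      # Hypothetical FeedItem model, keyed on the entry's GUID to avoid duplicates
      next if feed.feed_items.exists?(guid: entry.entry_id)

      feed.feed_items.create!(
        title:        entry.title,
        url:          entry.url,
        published_at: entry.published,
        summary:      entry.summary
      )
    end

    feed.update_column(:last_fetched_at, Time.now)
  end

  sleep 60   # poll the database once a minute
end
```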

My question is: how would the daemon share the models and migrations from Rails, especially if the daemon were running on another server, should the app require it for scalability? (i.e. a database server, a feed-crawler server, and instances of the front end) I'm probably falling victim to "premature scaling," but as a Ruby newbie I'm interested in the best way to handle this, for the sake of doing it the right way the first time.

Or am I going about this the wrong way?

redwall_hp
  • Consider using Delayed Job for your background processing; it loads your entire Rails app, so all application code is available: https://github.com/collectiveidea/delayed_job. Railscasts also has great tutorials on other options: http://railscasts.com/?tag_id=32 – house9 Jul 02 '13 at 05:43
  • I'll probably look at the Railscast for Delayed Job, since it seems to be designed with this sort of thing in mind. This part in particular sounds like what I'd need at first glance: https://github.com/collectiveidea/delayed_job#running-jobs – redwall_hp Jul 03 '13 at 21:39

1 Answer


As @house9 pointed out, you should use Delayed Job for this (https://github.com/collectiveidea/delayed_job).

DJ loads the whole Rails environment and can run as a separate process, even on a separate server. That's the easiest way to go.
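To make that concrete, here's a minimal sketch. The Feed model and its refresh! method are hypothetical; Delayed Job only requires that the job object responds to #perform:

```ruby
# app/jobs/refresh_feed_job.rb
# Plain Ruby job -- Delayed Job serializes it into the delayed_jobs table
# and calls #perform from a worker process.
class RefreshFeedJob < Struct.new(:feed_id)
  def perform
    # Hypothetical Feed#refresh! that fetches the feed (e.g. with Feedzirra)
    # and saves any new items.
    Feed.find(feed_id).refresh!
  end
end

# Enqueue from anywhere in the app (a controller, a rake task, a cron job...):
Feed.pluck(:id).each do |id|
  Delayed::Job.enqueue RefreshFeedJob.new(id)
end
```

Workers are started with rake jobs:work (or with the daemon script that rails generate delayed_job creates) and can run on a completely different machine, as long as it has a copy of the app and points at the same database.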

Mike Szyndel