
I run a small socket.io node.js app that always has a small number of concurrent users. For the most part, it can run 100% of the time on a single process (whether it be on my Linode or Heroku[1]).

I'm hoping to publicize it a bit for a few days, during which I expect a decent amount of traffic, which means many concurrent websocket/xhr-polling connections. However, the socket.io connections on the two servers don't need to communicate with each other.

The app was running on my Linode for quite some time, but I've recently transferred it over to a single free Heroku dyno. Even running with xhr-polling (Heroku doesn't support websockets natively), it is very fast.

I'm curious how I can use my Linode to run and load balance the app, but push users over to the Heroku dyno (or vice versa) if it starts to slow down. This should be possible with something like node-http-proxy, but I had a tough time getting it running properly. Ideally, I could fine-tune it so that once load hits a certain point, or when I flick a switch, it would begin balancing users over to the second server.
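
Something along the lines of the sketch below is roughly what I had in mind with node-http-proxy (this uses its createProxyServer API; the port, the Heroku hostname, and the "spill" switch are all placeholders, and I haven't gotten it working reliably):

var http = require('http');
var httpProxy = require('http-proxy');

// rewrite the Host header to the target's host (Heroku routes requests by hostname)
var proxy = httpProxy.createProxyServer({ changeOrigin: true });

// placeholders: the app running locally on the Linode, and the Heroku dyno
var primary   = 'http://127.0.0.1:3000';
var secondary = 'http://my-app.herokuapp.com';

var spill = false; // flip this (manually or from a load check) to start balancing
var flip  = false;

function pickTarget() {
  if (!spill) return primary;
  flip = !flip; // naive alternation once spilling is turned on
  return flip ? secondary : primary;
}

var server = http.createServer(function (req, res) {
  proxy.web(req, res, { target: pickTarget() });
});

// websocket upgrades have to be proxied explicitly
server.on('upgrade', function (req, socket, head) {
  proxy.ws(req, socket, head, { target: pickTarget() });
});

server.listen(80);

Presumably I'd also want each client to stick to whichever backend it first hit so xhr-polling keeps working, but that's a detail I still have to sort out.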

Basically, I'm just looking to be pointed in the right direction. Code would be nice, but not necessary. The solution doesn't even have to be in node.

Thank you!

[1] I tried out Nodejitsu as well for quite some time, and just didn't find it ready for production use.

Edit: Should I just use a simple round-robin, as explained here? What if I want to favor one server over the other?
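
For the favoring part, I assume something like a weighted pool could be swapped in for the pickTarget() in the sketch above (the addresses and weights are made up):

// roughly 3 requests to the Linode for every 1 to the Heroku dyno
var pool = [];
[ { target: 'http://127.0.0.1:3000', weight: 3 },
  { target: 'http://my-app.herokuapp.com', weight: 1 }
].forEach(function (entry) {
  for (var n = 0; n < entry.weight; n++) pool.push(entry.target);
});

var idx = 0;
function pickTarget() {
  idx = (idx + 1) % pool.length; // weighted round-robin over the expanded pool
  return pool[idx];
}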

switz
  • Since Heroku already has an effective load balancer in place, why not just use them for the production app host? Their setup will allow you to easily and quickly scale up if you wind up with more traffic than expected. – redhotvengeance Jan 07 '13 at 07:56
  • Because a second dyno costs $34 and I already own a perfectly good Linode for at least a month. I'm a student. :) - Plus, I'm not sure Heroku would even handle socket.io on multiple dynos properly. – switz Jan 07 '13 at 07:57
  • This is what [Cluster](http://nodejs.org/api/cluster.html) is for, but it's still in an experimental state. – mekwall Jan 07 '13 at 08:52
  • Cluster is for forking a process on one server. Here I want to load balance two separate processes on two separate servers. – switz Jan 07 '13 at 08:55
  • @switz Cluster load balances requests over multiple Node.js processes. Title is a bit misleading so I suggest that you change it. – mekwall Jan 07 '13 at 09:20
  • Do you control the http server? Nginx works really nicely as a reverse proxy/load balancer, see http://nginx.org/en/docs/http/ngx_http_upstream_module.html – myanimal Jan 07 '13 at 09:50
  • I've had some problems with nginx + websockets. Even with all of the workarounds that exist out there. – switz Jan 07 '13 at 16:48
  • Are you using it with derbyjs? Just curious. – Juzer Ali Jan 07 '13 at 18:05
  • This looks like a job for HAproxy to me. Nginx could be an option also. – nha Mar 19 '14 at 18:55

1 Answer


You can use Foreman for this type of job:

Procfile

web: node app.js

Then simply invoke this on the command line:

$ foreman start -c web=4

This will create 4 instances of your node app listening on different ports.

08:29:41 web.1  | started with pid 2576
08:29:41 web.2  | started with pid 1968
08:29:41 web.3  | started with pid 2712
08:29:41 web.4  | started with pid 5280
08:29:42 web.3  | Express server listening on port 5002
08:29:42 web.1  | Express server listening on port 5000
08:29:42 web.2  | Express server listening on port 5001
08:29:42 web.4  | Express server listening on port 5003
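
For this to work, the app has to bind to the port Foreman hands it in the PORT environment variable (the default base port is 5000, incremented for each process), along these lines:

var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.send('hello world');
});

// bind to the port Foreman assigns via PORT; fall back to 5000 when run directly
var port = process.env.PORT || 5000;
app.listen(port, function () {
  console.log('Express server listening on port ' + port);
});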

You can then put nginx in front as a reverse proxy and list those ports in its upstream.
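
For example, an upstream block along these lines (the ports match the Foreman output above; the Upgrade headers for websockets need nginx 1.3.13 or newer):

upstream node_app {
    server 127.0.0.1:5000;
    server 127.0.0.1:5001;
    server 127.0.0.1:5002;
    server 127.0.0.1:5003;
}

server {
    listen 80;

    location / {
        proxy_pass http://node_app;

        # pass websocket upgrade requests through (nginx 1.3.13+)
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}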

Jürgen Paul