
Right now I have a word game where users are randomly paired up and send messages to each other. I'm using Node.js and Socket.io. Basically, I'm storing each socket.id as a key in an object of users, and one of the properties on each user is the socket.id of their partner. When someone sends a message, I look up that user, get their partner's socket.id, and send the message to that specific socket.
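To make it concrete, this is roughly what I have on a single instance (simplified, and the names are made up for the example):

```js
const { Server } = require("socket.io");
const io = new Server(3000);

// users[socket.id] = { partnerId: <socket.id of their partner>, ... }
const users = {};

io.on("connection", (socket) => {
  users[socket.id] = { partnerId: null }; // set when a pairing happens

  socket.on("message", (text) => {
    const me = users[socket.id];
    if (me && me.partnerId) {
      // Deliver only to the partner's socket on this same instance
      io.to(me.partnerId).emit("message", text);
    }
  });

  socket.on("disconnect", () => {
    delete users[socket.id];
  });
});
```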

However, this is not going to work once I have multiple instances of Node behind NGINX or some other load balancer. I've looked into using a tool such as Redis, but every example I've found involves broadcasting the same message to users across instances. I'm not sending the same message to thousands of people across multiple instances; I'm only ever sending messages between two people, though many of these little chatrooms need to exist at once. What if two users can't get paired because they are connected to different instances of Node? And even if they can, I feel like I shouldn't be sending every message to every instance of Node when every instance except the one the partner is connected to will do nothing with it.

Any information on tackling this kind of problem would be greatly appreciated.


1 Answer


Here's one approach. Not sure if it's the best, but it seems pretty simple and reliable.

The basic idea is that every user is attached to one server. Like, user Alice has a websocket connection to Server 1, and user Bob has a connection to Server 2.

Then you have a central table (perhaps in Redis, or whatever you're going to use for account management) which says which users are on which servers. You change this whenever a user connects or disconnects. If they're allowed to connect multiple times at once, then the table needs to allow for that, but let's assume not, for now.
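Here's a minimal sketch of that table, assuming Redis via the ioredis client and reusing the io instance from your snippet. The key name, SERVER_ID, and the idea that each client identifies itself with a userId are all just illustrative assumptions:

```js
const Redis = require("ioredis");
const redis = new Redis(); // points at your shared Redis instance

// However you identify this instance (env var, hostname, etc.)
const SERVER_ID = process.env.SERVER_ID || "server-1";

// Local map so this instance can find the socket for a user it owns
const localSockets = {}; // userId -> socket.id

io.on("connection", async (socket) => {
  const userId = socket.handshake.auth.userId; // assuming clients send an id

  localSockets[userId] = socket.id;
  // Central table: this user is currently connected to this server
  await redis.hset("user-servers", userId, SERVER_ID);

  socket.on("disconnect", async () => {
    delete localSockets[userId];
    await redis.hdel("user-servers", userId);
  });
});
```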

So, to get a message from Alice to Bob, the process is (there's a rough code sketch after the list):

  • Alice sends it to Server 1.
  • Server 1 looks up the server for Bob and sees it's Server 2.
  • Server 1 sends it to Server 2.
  • Server 2 sends it to Bob.
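
In code, that might look something like the following. It assumes the Redis table from above, that your pairing record also stores the partner's user id (since a socket.id only means something on the instance that owns it), and a forwardToServer helper for the server-to-server hop, which is sketched a bit further down:

```js
socket.on("message", async (text) => {
  const me = users[socket.id];
  if (!me || !me.partnerUserId) return;

  // Which server is the partner connected to?
  const partnerServer = await redis.hget("user-servers", me.partnerUserId);

  if (partnerServer === SERVER_ID) {
    // Partner happens to be on this instance: deliver locally
    io.to(localSockets[me.partnerUserId]).emit("message", text);
  } else if (partnerServer) {
    // Hand the message to the partner's server, and only that server
    forwardToServer(partnerServer, { to: me.partnerUserId, text });
  }
});
```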

It sounds like in your situation, you'd want to keep the connection between Server 1 and Server 2 open. Maybe you even want to have connections open between every pair of servers. N-squared connections sounds like a lot, but it's only N per server, so it's not bad.
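One way to wire up those server-to-server links, sketched with socket.io-client. The PEER_ADDRESSES map is purely illustrative, and a real setup would want to distinguish server connections from user connections (e.g. with an auth token); a per-server Redis channel or plain HTTP between servers would work just as well:

```js
const { io: connectTo } = require("socket.io-client");

// Assumed: internal addresses of the other instances
const PEER_ADDRESSES = {
  "server-1": "http://10.0.0.1:3000",
  "server-2": "http://10.0.0.2:3000",
};

// Keep one outgoing connection open to every other server (N per server)
const peers = {};
for (const [id, url] of Object.entries(PEER_ADDRESSES)) {
  if (id !== SERVER_ID) peers[id] = connectTo(url);
}

function forwardToServer(serverId, payload) {
  peers[serverId].emit("relay", payload);
}

// Receiving side: a relayed message arrives like any other socket event,
// so deliver it to the local socket for that user (the last step in the list)
io.on("connection", (socket) => {
  socket.on("relay", ({ to, text }) => {
    const targetSocketId = localSockets[to];
    if (targetSocketId) io.to(targetSocketId).emit("message", text);
  });
});
```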

Not sure what size servers you're using, but I'd expect a few thousand websocket connections per server to be fine (maybe even 10k), as long as the traffic is pretty low, as you describe. So I think this would probably scale to a million users without much problem.