
I'm very new to Redis and Node.js, but I would like to improve my web application's performance using Redis and add a real-time notifications feature using Node.js.

Right now I have added one EC2 instance to run both processes, bound it to an Elastic IP address, and associated a subdomain with it. However, I read on the Redis website that a machine running Redis should not be exposed to an untrusted environment.

From http://redis.io/topics/security:

Redis is designed to be accessed by trusted clients inside trusted environments. This means that usually it is not a good idea to expose the Redis instance directly to the internet or, in general, to an environment where untrusted clients can directly access the Redis TCP port or UNIX socket.

On the other hand, the Node.js process, which listens for WebSocket connections, must be exposed to the internet so my web application can interact with it in real time.

So my first question is: do I really need two EC2 instances, one for Redis and one for Node.js?

My second question is: what is the best way to keep my existing PHP sessions and have Node.js recognize the just-logged-in user from the PHP session?

I was about to modify my PHP session handling, changing it from database-table sessions to Redis sessions, when I read the guide from the Redis website above.

Thank you.

robotijo
    Make your Redis instance listen on loop back (`127.0.0.1` or `::1`) or even better a UNIX socket (these exist in the filesystem and their access can be protected by file permissions) rather than the external IP address. This solves your main problem. – Dan D. Nov 20 '12 at 08:23
  • I've uncommented `bind 127.0.0.1` in redis.conf, restarted redis-server, and then listened from a Node.js client. Using Predis as another client for PHP, installed on a machine remote from the Redis instance, I tried to publish a message, and the Node.js client successfully received it. Does this mean my Redis instance is already isolated from the outside but still able to receive published messages? Thank you – robotijo Nov 20 '12 at 10:35

2 Answers


In the main configuration file, /etc/redis/redis.conf, there should be a `bind 127.0.0.1` line (if it is commented out or missing, add it); then save and restart Redis. This directive tells the service to listen only for connections from localhost, which should be sufficient to keep the Redis installation away from the outside world. And if I remember correctly, EC2 denies inbound traffic from the outside world by default.
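For reference, the relevant directives in `redis.conf` would look something like this (the UNIX-socket lines are optional extras, and the socket path shown is just an example):

```conf
# Listen only on the loopback interface
bind 127.0.0.1

# Optionally, also accept connections on a UNIX socket,
# protected by ordinary filesystem permissions:
unixsocket /var/run/redis/redis.sock
unixsocketperm 770
```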

As for your second question, you need to make sure the session ID is read consistently between the two languages. `session_id()` gives you the ID of the current session in PHP. You just need to set up Node to read the `PHPSESSID` cookie, and make sure that cookie is actually sent by the browser to the Node.js service under whatever hostname it listens on. In PHP, you change the cookie settings in php.ini or with `session_set_cookie_params()` before calling `session_start()`.
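To sketch the Node side of this: a minimal helper that pulls the `PHPSESSID` value out of an incoming request's `Cookie` header. The Redis key prefix shown in the usage comment (`PHPREDIS_SESSION:`) is an assumption — the actual prefix depends on which PHP Redis session handler you end up using.

```javascript
// Sketch: extract the PHPSESSID cookie from a raw Cookie header so
// Node can look the PHP session up (e.g. in Redis).
function getPhpSessionId(cookieHeader) {
  if (!cookieHeader) return null;
  for (const pair of cookieHeader.split(';')) {
    const [name, value] = pair.trim().split('=');
    if (name === 'PHPSESSID') return value || null;
  }
  return null;
}

// Hypothetical usage inside an HTTP / WebSocket upgrade handler:
//   const sid = getPhpSessionId(req.headers.cookie);
//   if (sid) {
//     // Key prefix is an assumption; check your PHP session handler.
//     redisClient.get('PHPREDIS_SESSION:' + sid, (err, data) => { /* ... */ });
//   }

module.exports = { getPhpSessionId };
```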

Julian Lannigan
  • One thing I couldn't find an answer to yet: after I enabled `bind 127.0.0.1`, Redis indeed refuses all remote connections. How am I supposed to allow my web server on a different machine to publish data to it, then? I've already tried npm's http-proxy and bouncy, but neither works. There's also redis-proxy, but it's in alpha and doesn't support the pub/sub feature. Thanks – robotijo Nov 21 '12 at 02:04
  • To do that, I would use iptables to handle the traffic, making sure that when you open the port, you only do so for the given machines. https://help.ubuntu.com/community/IptablesHowTo – Julian Lannigan Nov 21 '12 at 06:20
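A hedged sketch of the kind of iptables rules the comment above means — the web server's address (`10.0.0.5`) is a placeholder for your own private IP, and this only applies if you let Redis listen on the private interface instead of only `127.0.0.1`:

```shell
# On the Redis host (requires root): allow only the web server
# to reach Redis's default port, and drop everyone else.
iptables -A INPUT -p tcp -s 10.0.0.5 --dport 6379 -j ACCEPT
iptables -A INPUT -p tcp --dport 6379 -j DROP
```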

You'll want to do as the commenter said and listen on 127.0.0.1; for added security, simply don't open the port to the outside world. This is the default behavior in AWS because of the security group feature: to open a port to the outside world, you have to explicitly add it to the security group.

Here's a little more info on security groups: http://docs.amazonwebservices.com/AWSEC2/latest/UserGuide/using-network-security.html

Charlie Key
  • So to conclude: my application will have one web-server machine, plus one server machine running Redis + Node.js + Socket.io, right? That last server will be exposed to the internet on the particular port Node.js uses, but the Redis port stays isolated, listening only on 127.0.0.1. Is that correct? CMIIW, thanks – robotijo Nov 20 '12 at 17:17
  • Yes, that would work absolutely fine, especially for a small-to-medium-sized site. – Charlie Key Nov 20 '12 at 19:43