
I'm going to run a bit of software on server A to add and remove entries from elasticsearch on server B.

How can I allow these remote create and delete operations, but still keep ES secure? Is IPtables the right route or is there a better solution?

Jimmy

2 Answers


The usual solution (as far as I know, and certainly what I've implemented) is a combination of iptables (to restrict direct access) and a reverse proxy in Apache, Nginx, or your web server of choice. A web search for "securing elasticsearch" brings up plenty of relevant material.
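As a rough sketch of that setup (the hostname, certificate paths, and htpasswd file are all assumptions, not anything from your environment), an Nginx reverse proxy in front of ES listening only on localhost might look like:

```nginx
server {
    listen 443 ssl;
    # Assumed hostname - replace with your own
    server_name search.example.com;

    ssl_certificate     /etc/nginx/ssl/search.crt;
    ssl_certificate_key /etc/nginx/ssl/search.key;

    location / {
        # Basic auth file created with the htpasswd utility
        auth_basic           "Elasticsearch";
        auth_basic_user_file /etc/nginx/.htpasswd;

        # ES itself is bound to localhost; iptables blocks direct access
        proxy_pass http://127.0.0.1:9200;
        proxy_set_header Host $host;
    }
}
```

Clients then talk HTTPS with credentials to the proxy instead of hitting port 9200 directly.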

Whether a reverse proxy is appropriate depends on exactly what you're trying to do. We use this with Kibana and Shibboleth to restrict query access, which means we're reusing our standard access-control machinery. Data is fed in by Logstash, which has direct access allowed in iptables and doesn't go through Apache.
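The iptables side of that can be sketched as follows (the source address 192.0.2.10 standing in for your server A, and the default ES ports 9200/9300, are assumptions):

```shell
# Allow server A (assumed to be 192.0.2.10) to reach the ES HTTP port directly
iptables -A INPUT -p tcp -s 192.0.2.10 --dport 9200 -j ACCEPT

# Allow loopback traffic so the local reverse proxy can reach ES
iptables -A INPUT -i lo -j ACCEPT

# Drop everything else aimed at the ES HTTP and transport ports
iptables -A INPUT -p tcp --dport 9200 -j DROP
iptables -A INPUT -p tcp --dport 9300 -j DROP
```

Anything trusted (like a Logstash feeder) gets an explicit ACCEPT; everyone else goes through the authenticated proxy or is dropped.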

It does look like the state of the art has moved on significantly since we set up our cluster, though (see for example http://www.elastic.co/guide/en/shield/current/architecture.html).

Paul Haldane
  • What is the benefit of going through nginx as a reverse proxy as opposed to accessing the ES port directly? – Jimmy Apr 15 '15 at 15:00

If you have the $, then you can get the Shield software for Elasticsearch. This will allow you to use PKI certificates for authentication and authorisation to the ES cluster. Combined with iptables to restrict access, this should be everything you need.

If you're trying to do it for free, then use Apache as a reverse proxy in front of ES to layer SSL over the top with some sort of authentication - either Basic user/password, or certificate-based if you want to be really secure. Again, iptables can block direct access to ES from outside, and you can use Apache access rules or iptables to block unauthorised IPs.
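A minimal sketch of that Apache setup, assuming mod_ssl, mod_proxy, and mod_proxy_http are enabled and using placeholder hostnames and paths:

```apache
<VirtualHost *:443>
    # Assumed hostname - replace with your own
    ServerName search.example.com

    SSLEngine on
    SSLCertificateFile    /etc/apache2/ssl/search.crt
    SSLCertificateKeyFile /etc/apache2/ssl/search.key

    # Forward all requests to ES bound on localhost
    ProxyPass        / http://127.0.0.1:9200/
    ProxyPassReverse / http://127.0.0.1:9200/

    <Location />
        # Basic user/password auth; the file is created with htpasswd
        AuthType Basic
        AuthName "Elasticsearch"
        AuthUserFile /etc/apache2/.htpasswd
        Require valid-user
    </Location>
</VirtualHost>
```

For certificate-based authentication you would instead set SSLVerifyClient require with an SSLCACertificateFile, so only clients holding a certificate signed by your CA get through.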

Steve Shipway