9

We have around 2,500 Linux servers and a jumpstart server from which we can SSH to any of them for system administration tasks. We have deployed a single private key there and the matching public key on all servers, but this is a huge security risk.

We want to deploy a different key for each server, but that would create a large number of keys and make them difficult to manage. Please suggest the correct way to handle many servers and keys.

fatal_error
Akshay
  • Configuration management tooling. – user9517 Jan 04 '17 at 17:59
  • IdM, IPA, Kerberos, hell, even Facebook's SSL-Cert-SSH would do the trick... – John Keates Jan 04 '17 at 22:31
  • You can use literally any tool that can copy an authorized_keys file. And better to copy the whole file rather than scripting additions/removals, so you know exactly what's in the file by looking at the source file (a ``lineinfile``-style macro, for instance, would not remove unused old keys). – allo Apr 05 '17 at 17:46
  • Have a look on how Netflix manages the same kind of problem: https://www.oreilly.com/learning/how-netflix-gives-all-its-engineers-ssh-access with presentation at https://speakerdeck.com/rlewis/how-netflix-gives-all-its-engineers-ssh-access-to-instances-running-in-production and opensource code at https://github.com/Netflix/bless – Patrick Mevzek Aug 05 '18 at 21:05

6 Answers

5

FreeIPA does SSH key management pretty well. I use it successfully at home, and many businesses use it in production (the freeipa-users mailing list is hyper-active). It's the upstream free-software project on which Red Hat's Identity Management solution is based, and Red Hat developers moderate and help on the freeipa-users mailing list.

Basically, it's an Active Directory-like service for Unix/Linux environments. It can also sync with AD, and it has *nix-native features like NFS automounts, centralized sudo policies, etc.

Each user can add their SSH key to their FreeIPA profile. Then sshd can be configured with an AuthorizedKeysCommand provided by the sssd package, which queries the FreeIPA server. Combined with sudo policies, you get privilege escalation and an audit trail (who sudo'ed).
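For illustration, the sshd side of that setup typically looks like this (a minimal sketch; the exact path to sss_ssh_authorizedkeys can vary by distribution):

```
# /etc/ssh/sshd_config — fetch users' public keys from FreeIPA via sssd
AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys
AuthorizedKeysCommandUser nobody
```

With this in place, no authorized_keys files need to live on the target hosts at all; sshd asks sssd (and thus FreeIPA) for each user's keys at login time.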

Being a Red Hat project, it is a one-liner to install on Fedora or CentOS, but I've also successfully installed and configured Debian boxes as FreeIPA clients. I install the authentication server on Fedora Server, though, which is its natively supported platform.

http://www.freeipa.org/page/Main_Page

  • FreeIPA doesn't really work in the cloud because it requires Kerberos and manageable DNS (reverse hostnames, static IPs, etc). It can work fine in a highly static corporate datacenter, but it falls down when dealing with an even slightly dynamic cloud environment, especially because you can't control reverse DNS on most clouds. – fatal_error Jul 04 '20 at 01:57
4

The problem I see with using the same SSH key everywhere is the lack of accountability.

I would, instead, let each user have their own SSH key or keys, and use a centralised authentication and authorisation system.

This way, public keys do not even need to be distributed to the target systems, and can be stored in a central directory service like FreeIPA.

In addition to gaining accountability, you also have the capability of defining fine-grained Role-Based Access Control (RBAC) policies.

dawud
3

For 2,500 hosts you probably already have a configuration management system, but if you don't, you could use SaltStack. We have this for root auth:

user1auth:
  ssh_auth:
    - present
    - user: root
    - source: salt://resources/ssh_keys/user1

user2auth:
  ssh_auth:
    - present
    - user: root
    - source: salt://resources/ssh_keys/user2
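Assuming these states are included in your top file, rolling the keys out to every minion is then a single command (a sketch; targeting is up to you):

```
salt '*' state.apply
```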

You don't need to have the private key on the jump host. Just use agent forwarding when logging in: ssh -A root@host.
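Agent forwarding can also be enabled per host in the client config instead of passing -A every time (a sketch; the host names are placeholders):

```
# ~/.ssh/config on your workstation
Host jump
    HostName jump.example.com
    ForwardAgent yes
```

Note that forwarding your agent to a host you don't fully trust lets root on that host use your loaded keys while you're connected, so enable it selectively.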

There are other systems too (Puppet and CFEngine, for instance).

Halfgaar
  • I agree, Salt is a great way to go, but there's an abstraction mismatch between the server configuration and the one-to-many users on each server. Userify works well with Salt (as well as Ansible, Chef, Puppet, etc.), though, and focuses on users, with a nice web dashboard for your users to update their keys and manage authorization, while your config manager handles the server itself and installing Userify. – fatal_error Jul 04 '20 at 02:00
3

Most security frameworks (e.g. HIPAA, PCI) require one key per human user. Users should never share keys, any more than they would share passwords.

Each human user needs their own account, and those accounts should be removable across all servers in your entire enterprise when people move on to other projects. This is a task that begs for automation!

I work for Userify, which manages the keys for you across your teams and all of your servers and can even interface with Active Directory (Userify Enterprise), but there are other options as well, like SSH Universal Key Manager.

Your choice should depend on what your needs and budget are, and the features that you find important.

For example, many centralized systems are centralized to the point that you might not be able to log into any of your servers if, say, your LDAP server is down. (Userify will continue to operate properly even if AD or LDAP is down, because the layers of public key cryptography extend all the way down to the end server.)

It's important to be able to manage your SSH permissions centrally, while decentralizing the actual operation for greater reliability and control.

See also this Slant topic: https://www.slant.co/topics/8010/~managers-for-ssh-keys

fatal_error
1

Copying the public key to all servers you log into is generally how things are done with SSH. If you have created a private/public keypair on the jumpstart server and copied the public key into the ~/.ssh/authorized_keys file on each host you want to SSH into, that is (partially) the accepted way of securing remote logins via SSH. Other things to consider to further secure SSH:

  • in /etc/ssh/sshd_config, set PasswordAuthentication no and PubkeyAuthentication yes
  • set LogLevel VERBOSE in /etc/ssh/sshd_config and then monitor /var/log/auth.log for anomalies
  • edit /etc/security/access.conf to allow logins only by certain users from certain IPs
  • edit /etc/pam.d/login and add account required pam_access.so to enable the changes you made to access.conf
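Taken together, the hardening above might look like this (a sketch; the IP address is a placeholder for your jumpstart server, and sshd must be restarted after editing):

```
# /etc/ssh/sshd_config
PasswordAuthentication no
PubkeyAuthentication yes
LogLevel VERBOSE

# /etc/security/access.conf — permit root only from the jump host
+ : root : 10.0.0.5
- : root : ALL

# /etc/pam.d/login
account required pam_access.so
```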

Deploying a "different key for each server" seems to mean that you want a different public key in ~/.ssh/authorized_keys on each server. The reason for this is unclear to me, but if you must, I would create a text file on the jumpstart server with one host per line. Use a loop to read the file and run ssh-keygen to create a keypair for each host, and use the same loop to push each public key into that host's ~/.ssh/authorized_keys (with ssh-copy-id, or sed -i, or similar).
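A sketch of that loop follows. The hosts.txt name, the key file names, and the temp directory are illustrative assumptions, not from the original post; on a real jump host you would use ~/.ssh and your actual host list.

```shell
#!/bin/sh
# build an example host list (in practice this file already exists)
printf 'web01\nweb02\n' > hosts.txt
KEYDIR=$(mktemp -d)   # stand-in for ~/.ssh in this sketch
while read -r host; do
    [ -z "$host" ] && continue
    # one dedicated keypair per host, no passphrase
    ssh-keygen -q -t ed25519 -N '' -f "$KEYDIR/id_$host" -C "jump-to-$host"
    # tell the ssh client which key belongs to which host
    printf 'Host %s\n    IdentityFile %s/id_%s\n\n' "$host" "$KEYDIR" "$host" >> "$KEYDIR/config"
    # push the public key once, using the existing shared key, e.g.:
    #   ssh-copy-id -i "$KEYDIR/id_$host.pub" "root@$host"
done < hosts.txt
ls "$KEYDIR"
```

Once the public keys are installed, the shared key can be removed from every server's authorized_keys.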

Server Fault
-2

If you have no budget constraint, go for a proprietary solution like Dell TPAM or CyberArk (I don't like either of them).

Otherwise, you can use one SSH key per user with ssh-agent or gpg-agent.