
I am working on standardizing different bits in our environment and as part of that would like to move towards ssh key based authentication.

Currently we have individual accounts on each server (around 150-200 of them), and thankfully we keep the uid/gid/username the same across most of these servers. It becomes difficult to add or remove users on each host when someone joins or leaves the firm. Generally, there is a software account for each team, and members of the team ssh to the servers as that software account by entering the account's password.

To ease our administration, I will be introducing puppet in the environment and want to use ssh_authorized_key puppet resource to update the software accounts authorized key file with the public key of the users.

In my understanding, this will be like:

ssh_authorized_key { 'Ram':
  user => '',
  type => 'ssh-rsa',
  key  => '',
}

ssh_authorized_key { 'Shyam':
  user => '',
  type => 'ssh-rsa',
  key  => '',
}
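Since the goal is the shared software account described above, the resources would presumably point every team member's key at that one account. A sketch with placeholder values (the 'software' account name and the key strings are illustrative, not real):

    ssh_authorized_key { 'ram@workstation':
      ensure => present,
      user   => 'software',
      type   => 'ssh-rsa',
      key    => 'AAAAB3...placeholder-ram',
    }

    ssh_authorized_key { 'shyam@workstation':
      ensure => present,
      user   => 'software',
      type   => 'ssh-rsa',
      key    => 'AAAAB3...placeholder-shyam',
    }

The resource title doubles as the comment field in authorized_keys, so using 'user@host' titles makes it easy to see later whose key each entry is.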

So, how do I make sure this public key is the same across all the servers for a particular user? For example, if the user 'Ram' tries to ssh to server 'hostA' from any server, the same public/private key pair should be used.

Any pointers on how I can maintain that? Please let me know if something isn't clear. Thanks in advance.

-Ram

Ram Kumar
    Why aren't you using LDAP or something similar to create accounts and manage users' access to servers? 200 is too many to manage one by one. Also, in my experience Salt is better at this type of orchestration. I recently did research for this kind of administration of a thousand servers, and Salt was the winner, so I adopted it. – bayindirh Sep 28 '17 at 11:19
  • Kerberos also is an option here if you want to keep it specifically for authentication (as opposed to authorization). – TheFiddlerWins Sep 28 '17 at 12:13
  • I second the first comment: Use central auth, don't create accounts per machine. Puppet and Salt and whatever else are great, but not the right tools for this job. – gxx Sep 28 '17 at 14:32
  • I agree with the comments. Setting up LDAP is also in my list but for now, we want to go ahead with ssh_keys. – Ram Kumar Sep 29 '17 at 05:46

3 Answers


You need to use SSH agent forwarding. Basically, you create a file ~/.ssh/config on the client and add:

Host host
  ForwardAgent yes

This tells SSH to forward authentication requests from the remote session back to the agent on the machine you started from. The user's private key therefore never leaves their own machine; it is only used there, to answer each server's challenge when connecting onward.
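A slightly fuller ~/.ssh/config sketch (hostnames and the 'software' username are examples, not from the question):

    # ~/.ssh/config on the user's own workstation
    Host hostA
        HostName hostA.example.com
        User software
        ForwardAgent yes

The key must be loaded into the local agent first (ssh-add ~/.ssh/id_rsa); after that, hops from hostA onward reuse the forwarded agent instead of needing a private key stored on hostA. One caveat: while the session is open, anyone with root on the intermediate host can use the forwarded agent socket, so only forward to hosts you trust.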

For a more complete guide, look at https://developer.github.com/v3/guides/using-ssh-agent-forwarding/.

Tero Kilkanen

Do you have a copy of all your users' public keys? It should be relatively trivial to write a script that runs ssh-copy-id for each user's credentials.
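If you do end up collecting everyone's public keys centrally, merging them into a single authorized_keys file for distribution (via Puppet, scp, or ssh-copy-id) is a few lines of shell. Everything below — directory names, fake key strings — is illustrative:

```shell
#!/bin/sh
# Sketch: assemble one authorized_keys file from per-user public key files.
# In reality each *.pub file comes from that user's own `ssh-keygen` run;
# the key material here is a fake placeholder.
KEY_DIR=./pubkeys
OUT=./authorized_keys

mkdir -p "$KEY_DIR"
echo "ssh-rsa AAAAB3_FAKE_KEY_1 joe@workstation"   > "$KEY_DIR/joe.pub"
echo "ssh-rsa AAAAB3_FAKE_KEY_2 jerry@workstation" > "$KEY_DIR/jerry.pub"

: > "$OUT"                      # truncate, so re-runs stay idempotent
for key in "$KEY_DIR"/*.pub; do
  cat "$key" >> "$OUT"
done
chmod 600 "$OUT"
wc -l < "$OUT"                  # prints 2: one line per collected key
```

Shipping the same assembled file to /home/software/.ssh/authorized_keys on every server gives each user one key pair that works everywhere.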

TheFiddlerWins
  • No, users are yet to generate them. Once they generate the key pair (using ssh-keygen -t ) and give me the public keys, I can then add it to the puppet ssh_authorized_key resource, so that I don't have to run ssh-copy-id each time. But the question is will that public-private key pair work across all the servers for a particular user? – Ram Kumar Sep 28 '17 at 13:02
  • It will, it was not uncommon to have users have their home directories mounted via NFS and their public/private keys *both* there. Of course this means that anyone that can access the user's home directory can login as that user - encrypt the private keys and use NFSv4 + Kerberos if you want this secure. But there is nothing linking a pub/private key to a specific host. – TheFiddlerWins Sep 29 '17 at 15:14

You will need to make sure that the user account exists on every machine the user intends to SSH into.

On the other hand, keys are not specific to users. You just need to add the public key to the ~/.ssh/authorized_keys of whichever user they intend to ssh in as. You can add the same key to root, ec2-user, 'joe', etc.

  • Thanks for the response. For example, let's assume that users named 'Joe' and 'Jerry' wants to ssh to 'n' number of servers as 'software' user. Using puppet, I will ensure that 'software' user is created on all 'n' number of servers. I can also ensure that I add 'Joe' and 'Jerry's' public key to '/home/software/.ssh/authorized_keys' file. But the question I have is, if Joe and Jerry had to ssh from 100 different servers (where Joe and Jerry are local accounts with same uid,gid,username), then how can I keep the private key that Joe and Jerry use same across all the 100 different servers? – Ram Kumar Sep 28 '17 at 14:22
  • Oh. I think the short answer there is "dont do that". In general you want to keep your private key in as few places as possible. Perhaps set up a "bastion host" that everyone connects to first that then has your private keys on it. – Daryl Metzler Sep 28 '17 at 14:31
  • I understand, that will be a last option if we don't find any other alternative. – Ram Kumar Sep 29 '17 at 05:44
  • I think the option presented by Tero Kilkanen can work: key forwarding. Is that an option? You need it configured on each _client_ – Daryl Metzler Sep 29 '17 at 14:05
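The bastion-host idea from the comments can be sketched in ~/.ssh/config with ProxyJump (available in OpenSSH 7.3 and later); hostnames and the 'software' username are examples:

    # ~/.ssh/config on the user's workstation
    Host bastion
        HostName bastion.example.com
        User software

    Host hostA hostB
        User software
        ProxyJump bastion

With ProxyJump the private key never needs to exist on the bastion at all: each connection is tunneled through it and authenticated end-to-end with the key on the user's workstation, which keeps the key in exactly one place.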