
I have what seems to be an ever-growing number of EC2 instances, and all is running fine and dandy. The one problem I'm facing, however, is figuring out a strategy for SSHing between the machines. Copying my private key to each instance would be counterproductive: the single key works fine when I need to SSH in from my personal machine, but not when I need to SSH from machine to machine.

What are some decent strategies to tackle this problem? How are you SSHing in between your cluster of EC2 instances?

imaginative

2 Answers


Use ssh-agent with agent forwarding:

eval "$(ssh-agent)"    # start the agent and export its environment variables
ssh-add                # load your default identity (e.g. ~/.ssh/id_rsa)
ssh -A remote-machine  # -A forwards the agent to the remote host

For easier use, add

Host remote-machine
ForwardAgent yes

to your ~/.ssh/config
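
To check that forwarding is actually working, list the keys the forwarded agent offers once you're on the remote machine (remote-machine and second-instance are placeholders here):

ssh remote-machine   # ForwardAgent is now set by ~/.ssh/config
ssh-add -l           # run on remote-machine: should list your local key
ssh second-instance  # hop onward; your private key never touches the first host

If ssh-add -l complains that it can't connect to an authentication agent, forwarding isn't set up on that hop.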

Hubert Kario

Well, given that all my EC2 instances are started with the same SSH key for root, I only need to load one key into the SSH agent. That said, I don't typically SSH in as root as a matter of policy, so my EC2 instances fire up and connect to a Puppet server, which then configures them: installing any applications, setting up user accounts with their respective SSH identity keys, and establishing sudo permissions. Then I just load my personal SSH identity into ssh-agent on my laptop from its LUKS-encrypted USB drive. The root SSH key put in place during EC2 instance initialization is then just there as a backup.
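
A minimal sketch of that key-loading step, assuming the USB stick shows up as /dev/sdb1 and the key lives at id_rsa on it (the device name and paths are illustrative):

sudo cryptsetup open /dev/sdb1 sshkeys   # unlock the LUKS container
sudo mount /dev/mapper/sshkeys /mnt
ssh-add /mnt/id_rsa                      # load the identity into the running agent
sudo umount /mnt
sudo cryptsetup close sshkeys            # lock the container again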

The other advantage of using Puppet is that I can use the Puppet server as a jump box into any of the instances, and have Puppet update each system's ssh_known_hosts and /etc/hosts files automatically for me.
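
A sketch of the jump-box part in ~/.ssh/config, with puppet.example.com and a 10.0.* internal naming scheme standing in for the real values (ProxyJump requires OpenSSH 7.3 or newer):

Host puppet
    HostName puppet.example.com
    ForwardAgent yes

Host 10.0.*
    ProxyJump puppet

With that in place, ssh 10.0.1.5 transparently hops through the Puppet server.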

Jeremy Bouse