
I have implemented SSH CA signing for client keys on my servers. sshd is configured with the following directive:

TrustedUserCAKeys /etc/ssh/trusted-users-ca.pem
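
For context, /etc/ssh/trusted-users-ca.pem simply contains the CA's public key, and my user certificate was signed roughly like this (the CA path, key ID and validity below are only illustrative):

# On the CA machine: sign the user's public key with the CA private key.
# -I sets the key ID (logged by sshd); -n sets the principal, which must match the login user "centos".
ssh-keygen -s ~/ca/datacenter-hic-ca -I jeroenjacobs -n centos -V +52w ~/.ssh/datacenter-hic-deploy.pub
# This produces ~/.ssh/datacenter-hic-deploy-cert.pub next to the key.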

I modified my local ssh config file so that my certificate is sent as well when I connect to my servers:

Host *.internal.headincloud.be
        User centos
        IdentityFile ~/.ssh/datacenter-hic-deploy
        CertificateFile ~/.ssh/datacenter-hic-deploy-cert.pub
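
To double-check that the certificate is actually picked up, a verbose connection attempt against one of the servers (host name illustrative) should show the certificate being offered in the debug output:

# ssh debug output goes to stderr; filter for certificate-related lines
ssh -v centos@postgres-01.internal.headincloud.be exit 2>&1 | grep -i cert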

This seems to work just fine, and I'm able to connect to my server without the need to deploy an authorized_keys file.

However, Ansible is unable to connect to my servers:

TASK [Gathering Facts] *********************************************************************************************************************************************************************
fatal: [postgres-01]: UNREACHABLE! => {"changed": false, "msg": "SSH Error: data could not be sent to remote host \"192.168.90.40\". Make sure this host can be reached over ssh", "unreachable": true}

As I already mentioned, I'm able to connect via ssh just fine.

I suspect Ansible is not sending the certificate file along, and that's why I am unable to connect.

I tried modifying my ansible.cfg as follows:

ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s -i ~/.ssh/datacenter-hic-deploy-cert.pub

or

ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s -i /Users/jeroenjacobs/.ssh/datacenter-hic-deploy-cert.pub

Neither of those works.

I cannot find a way to tell Ansible how to do this. Does anyone have an idea?

Jeroen Jacobs
  • At first glance Ansible seems to connect to a host named `192.168.90.40`, but your ssh config is set up for host names ending in internal.headincloud.be, not IP addresses; possibly you need to ensure that Ansible connects using the correct host name in the ssh connection rather than the IP address – HBruijn Sep 18 '19 at 21:00
  • Ah yes, I had a mismatch in my inventory file; I was using IP addresses there! Changing it to host names has fixed the issue. Can you put this as an answer instead of a comment? – Jeroen Jacobs Sep 19 '19 at 09:12

2 Answers


fatal: [postgres-01]: UNREACHABLE! => {"changed": false,
"msg": "SSH Error: data could not be sent to remote host \"192.168.90.40\".
Make sure this host can be reached over ssh", "unreachable"

At first glance Ansible seems to connect to a host at 192.168.90.40, but your ssh config is set up for hosts whose names end in *.internal.headincloud.be, not IP addresses.

Check your inventory; you probably need to ensure that Ansible connects using the correct host name rather than the IP address, or else you will need to add a second stanza to your ~/.ssh/config matching the IP addresses you're using.
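
For example, the inventory can point at the DNS names instead of the addresses (group and host name below are illustrative):

# inventory: connect via the FQDN so the existing ssh config stanza matches
[db]
postgres-01 ansible_host=postgres-01.internal.headincloud.be

Or, alternatively, a second stanza in ~/.ssh/config matching the addresses would also work:

Host 192.168.90.*
        User centos
        IdentityFile ~/.ssh/datacenter-hic-deploy
        CertificateFile ~/.ssh/datacenter-hic-deploy-cert.pub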

HBruijn

You can make Ansible use an arbitrary private key by setting the ansible_ssh_private_key_file variable. The best place to set this variable depends on which servers the key needs to be used with. If it's every server, then you could do something like this:

$ cat group_vars/all/sshkey 
ansible_ssh_private_key_file: /Users/jeroenjacobs/.ssh/datacenter-hic-deploy
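
For completeness: ssh normally finds the certificate on its own by appending -cert.pub to the identity file name, so pointing Ansible at the private key should be enough. If the certificate ever has to be named explicitly, a rough sketch for ansible.cfg (paths illustrative) would be:

# ansible.cfg: pass the certificate explicitly via ssh's CertificateFile option
[ssh_connection]
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s -o CertificateFile=~/.ssh/datacenter-hic-deploy-cert.pub
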
Michael Hampton
  • Please provide more info, as I don't understand why my naming convention is wrong. "datacenter-hic-deploy" is the private key, "datacenter-hic-deploy.pub" is the public key, and "datacenter-hic-deploy-cert.pub" is the ssh ca-signed certificate. – Jeroen Jacobs Sep 19 '19 at 04:54
  • @JeroenJacobs OK, then that doesn't apply. I misread the question. – Michael Hampton Sep 19 '19 at 11:29