I am trying to run an Ansible playbook from GitHub Actions against an EC2 instance in a private subnet, connecting through a bastion host, but the connection fails.
Here are the relevant steps from my workflow file:
```yaml
steps:
  - name: Checkout
    uses: actions/checkout@v3
  - name: terraform ssh key
    uses: webfactory/ssh-agent@v0.7.0
    with:
      ssh-private-key: ${{ secrets.EC2_PRIVATE_KEY }}
  - name: Run playbook
    uses: dawidd6/action-ansible-playbook@v2
    with:
      playbook: deploy.yaml
      directory: ./
      inventory: |
        [all]
        10.2.13.227
```
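As a sanity check (this step is not in my workflow yet, just a sketch), I believe adding something like this after the ssh-agent step would confirm the agent holds the key and that the runner can reach the bastion at 13.37.97.97:

```yaml
  # Hypothetical debug step: verify the loaded key and the first hop
  - name: Debug SSH connectivity
    run: |
      ssh-add -l  # list keys loaded into the agent by webfactory/ssh-agent
      # exits 0 only if the bastion accepts one of the loaded keys
      ssh -o StrictHostKeyChecking=no ec2-user@13.37.97.97 true
```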
This is my playbook:
```yaml
- hosts: all
  vars:
    ansible_ssh_user: "admin"
    ansible_ssh_common_args: >
      -o ProxyCommand="ssh -W %h:%p -q ec2-user@13.37.97.97" -i ~/keys/fr-prod.pem \
      -o ServerAliveInterval=5 \
      -o StrictHostKeyChecking=no
  tasks:
    - name: example
      debug:
        msg: Hello World from {{ inventory_hostname }}!
```
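To take Ansible out of the picture, my understanding is that the vars above amount to roughly this manual ssh invocation (a reconstruction on my part, not captured output; the inner ssh is the ProxyCommand hop through the bastion):

```sh
ssh -o ProxyCommand="ssh -W %h:%p -q ec2-user@13.37.97.97" \
    -o ServerAliveInterval=5 \
    -o StrictHostKeyChecking=no \
    -i ~/keys/fr-prod.pem \
    admin@10.2.13.227 true
```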
I expected this to connect to my private instance and show me this message:

```
Hello World from 10.2.13.227!
```
Instead, I got this error:
> fatal: [10.2.13.227]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: kex_exchange_identification: Connection closed by remote host\r\nConnection closed by UNKNOWN port 65535", "unreachable": true}
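If it helps, I understand that re-running with `-vvv` would print the exact ssh command line Ansible builds, which is what I would try next locally (sketch, assuming an `inventory` file containing the same `[all]` group as the workflow):

```sh
# Verbose run to surface the full ssh invocation, including how
# ansible_ssh_common_args is tokenized
ansible-playbook -i inventory deploy.yaml -vvv
```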
I think this error is caused by the SSH key being forwarded incorrectly. How can I fix this?