I want to deploy sshfs on 2 machines using Ansible. I have the following hosts configuration:
--- # hosts
...
cloud:
  children:
    cloudalpine:
      hosts:
        web-alpine-1:
          ansible_host: Public IP
        web-alpine-2:
          ansible_host: Public IP
In my playbook, I want to mount an sshfs volume, but I can't work out how to reference the OTHER node's private IP address.
The final configuration I want to achieve is:
web-alpine-1 > web-alpine-2:/etc/docker/storage
web-alpine-2 > web-alpine-1:/etc/docker/storage
--- # Playbook
...
# I want to create a playbook with an sshfs volume mount
- name: Volume mount for docker
  docker_volume:
    volume_name: "sshvolume"
    use_ssh_client: true
    driver: vieux/sshfs
    driver_options:
      # I am using a local private key
      IdentityFile: "{{ lookup('file', ansible_ssh_private_key_file) }}"
      port: 22
      # I can get the private IP of the machine running the task here,
      # but not the other machine's. I need each to point to the
      # other's private IP address.
      sshcmd: "<username>@{{ The other host's private IPv4 address }}:/etc/docker/storage"
How should I access the other node's IP address on each machine?
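For illustration, this is the kind of lookup I imagine (a sketch only; I'm assuming the private IP is exposed through a gathered fact such as ansible_default_ipv4.address, and that the cloudalpine group contains exactly these two hosts):

    # Hypothetical: select the one host in the group that is not the
    # current host, then read its private-IP fact from hostvars.
    vars:
      other_host: "{{ groups['cloudalpine'] | difference([inventory_hostname]) | first }}"
      other_private_ip: "{{ hostvars[other_host]['ansible_default_ipv4']['address'] }}"

Is something along these lines the right approach, or is there a more idiomatic way?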