
I have 3 remote VMs and 1 Ansible node.

I am getting the hostname of each VM by running the hostname command on the remote VMs through the Ansible shell module and registering the output in the hostname_output variable.

Then I want to pair each VM's IP (collected via gather_facts: True as {{ ansible_default_ipv4.address }}) with its hostname and append that line to a file temp_hostname on localhost, so I am delegating the task to localhost.

But here is the issue: on the console, the lineinfile module reports that the line was added each time the task ran for a node and was delegated to localhost, yet when I check the file on localhost, it contains only 1 entry instead of 3.

---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True

  tasks:
   - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
     shell: hostname
     register: hostname_output

   - name: writing hostname_output in ansible node in file on ansible node
     lineinfile:
      line: "{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}"
      dest: temp_hostname
      state: present
     delegate_to: 127.0.0.1

I even tried the copy module, as suggested in Ansible writing output from multiple task to a single file, but that gave the same result, i.e. only 1 entry.

---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True

  tasks:
   - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
     shell: hostname
     register: hostname_output

   - name: writing hostname_output in ansible node in file on ansible node
     copy:
        content: "{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}"
        dest: /volume200gb/sushil/test/code_hostname/temp_hostname
     delegate_to: 127.0.0.1

Finally, when I used the shell module with a redirection operator, it worked as I wanted, i.e. 3 entries in the file on localhost.

---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True

  tasks:
   - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
     shell: hostname
     register: hostname_output

   - name: writing hostname_output in ansible node in file on ansible node
     shell: echo -e "{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}" >> temp_hostname
     delegate_to: 127.0.0.1

I am running this playbook, get_hostname.yml, with the command:

ansible-playbook -i hosts get_hostname.yml --ssh-extra-args="-o StrictHostKeyChecking=no" --extra-vars "remote_user=cloud-user" -vvv

My hosts file is:

10.194.11.86 private_key_file=/root/.ssh/id_rsa
10.194.11.87 private_key_file=/root/.ssh/id_rsa
10.194.11.88 private_key_file=/root/.ssh/id_rsa

I am using Ansible 2.1.0.0.

I am using the default ansible.cfg, with no modifications.

My question is: why didn't the lineinfile and copy modules work? Did I miss anything or write something wrong?

Sushil Kumar Sah

3 Answers


I get inconsistent results using your first code block with lineinfile. Sometimes I get all 3 IPs and hostnames in the destination file, and sometimes only 2. I'm not sure why this happens, but my guess is that the parallel delegated tasks try to save their changes to the file at the same time and only one change gets picked up.

The second code block won't work because copy overwrites the file unless the content matches what is already there. The last host to run will leave the only IP/hostname in the destination file.

To work around this, you can loop over play_hosts (the active hosts in the current play) and reference each host's variables through hostvars.

- name: writing hostname_output in ansible node in file on ansible node
  lineinfile:
    line: "{{ hostvars[item]['ansible_default_ipv4'].address }} {{ hostvars[item]['hostname_output'].stdout }}"
    dest: temp_hostname
    state: present
  delegate_to: 127.0.0.1
  run_once: True
  with_items: "{{ play_hosts }}"

Or you can use a template with the same logic:

- name: writing hostname_output in ansible node in file on ansible node
  template:
    src: IP_hostname.j2
    dest: temp_hostname
  delegate_to: 127.0.0.1
  run_once: True

IP_hostname.j2

{% for host in play_hosts %}
{{ hostvars[host]['ansible_default_ipv4'].address }} {{ hostvars[host]['hostname_output'].stdout }}
{% endfor %}
kfreezy
  • Can we say that there is some bug in lineinfile module, because output should be consistent, not sometimes one, sometimes two or three – Sushil Kumar Sah Aug 16 '17 at 16:35
  • Might be worthwhile to open an issue on ansible's github and see what the authors have to say - https://github.com/ansible/ansible/issues. – kfreezy Aug 16 '17 at 16:42
  • raised the issue on github https://github.com/ansible/ansible/issues/28313 – Sushil Kumar Sah Aug 17 '17 at 04:49
  • reply from github on the bug: Concurrent write access to the same file is always going to be problematic, as they all read in the same file and then modify it. The solution here is to ensure no 2 tasks are running at the same time using serial: 1. – Sushil Kumar Sah Aug 18 '17 at 02:11

I tried to reproduce your issue and it did not happen for me; I suspect this is a problem with your version of Ansible, so try the latest.

That being said, I think you might be able to make it work using serial: 1; it is probably an issue with file locking that I don't see happening in Ansible 2.3. I also think that instead of using a shell task to gather the hostname, you could use the ansible_hostname variable, which is provided as an Ansible fact. You can also avoid gathering ALL facts, if all you want is the hostname, by adding a task specifically for that. In the end, it would look like this:

---
- name: get hostnames of dynamically created VMs
  hosts: all
  serial: 1
  remote_user: "{{ remote_user }}"

  tasks:
  - name: Get hostnames
    setup:
      filter: ansible_hostname

  - name: writing hostname_output in ansible node in file on ansible node
    lineinfile:
      line: "{{ ansible_default_ipv4.address }} {{ ansible_hostname }}"
      dest: temp_hostname
      state: present
    delegate_to: 127.0.0.1
  • I tried the same code on Ansible 2.3.1.0 also, the only difference was that, when I tried on ansible 2.1.0.0 I used to get only 1 entry, whereas when I tried on Ansible 2.3.1.0, I used to get two entries and sometimes randomly 3 entries. – Sushil Kumar Sah Aug 17 '17 at 04:23
  • Hmm, strange, I tried it several times without issues. Do you have set `ansible_connection=local` in your inventory file? And did you try as above setting serial? – David Ponessa Aug 17 '17 at 13:10
  • No, I have not set any such thing in inventory file, you can see my inventory file also in the description of question (hosts) – Sushil Kumar Sah Aug 17 '17 at 17:14
  • Try adding 127.0.0.1 to your hosts file with ansible_connection=local – David Ponessa Aug 17 '17 at 20:13
  • I don't want to do that, because if I do so, then tasks will run on localhost also, as I want to give all hosts in playbook – Sushil Kumar Sah Aug 18 '17 at 01:26
  • I submitted bug on github and they replied: Concurrent access to the same file is always going to be problematic, as they all read in the same file and then modify it. The solution here is to ensure no 2 tasks are running at the same time using serial: 1. notabug – Sushil Kumar Sah Aug 18 '17 at 02:05
  • you used serial: 1, which is the correct way, hence you didn't get the error. I had not used serial: 1, and since Ansible runs hosts in parallel, that is why I was getting the error. Thanks for your time in looking over my issue. – Sushil Kumar Sah Aug 18 '17 at 02:08

The problem here is that there are multiple concurrent writes to a single file, which leads to unexpected results.

One solution is to use serial: 1 on your play, which forces non-parallel execution across your hosts.

But it can be a performance killer depending on the number of hosts.
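For reference, serial is a play-level keyword; applied to the play in the question, it would look like this (sketch, untested):

```yaml
- name: get hostnames of dynamically created VMs
  hosts: all
  serial: 1                 # finish the whole play on one host before starting the next
  remote_user: "{{ remote_user }}"
  gather_facts: True
```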

I would suggest another solution instead: rather than writing to a single file, have each delegated task write to its own file (named here after the inventory_hostname value). That way there are no more concurrent writes.

After that, you can use the assemble module to merge all the files into one. Here is an example (untested):

---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True

  tasks:
  - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
    shell: hostname
    register: hostname_output

  - name: deleting tmp folder
    file: path=/tmp/temp_hostname state=absent
    delegate_to: 127.0.0.1
    run_once: true

  - name: create tmp folder
    file: path=/tmp/temp_hostname state=directory
    delegate_to: 127.0.0.1
    run_once: true

  - name: writing hostname_output in ansible node in file on ansible node
    template: src=tpl.j2 dest=/tmp/temp_hostname/{{ inventory_hostname }}
    delegate_to: 127.0.0.1

  - name: assemble hostnames
    assemble: src=/tmp/temp_hostname/ dest=temp_hostname
    delegate_to: 127.0.0.1
    run_once: true

Obviously you have to create the tpl.j2 file.
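A minimal tpl.j2 matching the line format used throughout this question would be just one line, since each host renders its own file with its own facts (no loop over hostvars needed):

```jinja2
{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}
```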

DeLoVaN