
I am working on an Ansible playbook where I want to read a file on each host and set an environment variable for that host based on some text in that file. I need that environment variable to be available during the entire playbook execution on that host.

What I have been reading suggests that if I define `environment:` under a task, it applies only to that task and not to subsequent tasks. Is that correct?
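To make the distinction concrete, this is my understanding of the task-level scope (sketch only; `MY_VAR` is just a placeholder name):

```yaml
- name: MY_VAR is visible only inside this one task
  command: echo $MY_VAR
  environment:
    MY_VAR: "task-scoped value"
```

What I need instead is something that holds for every subsequent task on the host. Here is my current play: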

- name: Modify server properties
  hosts: kafka_broker
  vars:
    ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    ansible_host_key_checking: false
    contents: "{{ lookup('file', '/etc/kafka/secrets/masterkey.txt') }}"
    extract_key: "{{ contents.split('\n').2.split('|').2|trim }}"

  environment:
    CONFLUENT_KEY: "{{ extract_key }}"

This is how I am trying to get the info from each host; I want to set the env variable per host, but have it apply for the entire playbook on that host:

- name: Slurp hosts file
  slurp:
    src: /etc/kafka/secrets/masterkey.txt
  register: masterkeyfile

- debug: msg="{{ masterkeyfile['content'] | b64decode }}"

- name: Set masterkeyfilecontent 
  set_fact:
    masterkeyfilecontent: "{{ masterkeyfile['content'] | b64decode }}" 

- name: Set masterkeyval 
  set_fact:
    masterkeyval: "{{ masterkeyfilecontent.split('\n').2.split('|').2|trim }}"

And then I want to set the env variable per host:

CONFLUENT_KEY: "{{ masterkeyval }}"

- debug:
    var: masterkeyval
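Putting those pieces together, this is the kind of playbook I imagine (untested sketch; I am assuming that a play-level `environment:` is templated again for every task, so a fact set by an early task would be picked up by the later ones):

```yaml
- name: Set a per-host env var from a file on each host
  hosts: kafka_broker

  environment:
    CONFLUENT_KEY: "{{ masterkeyval | default('') }}"

  tasks:
    - name: Slurp the key file on each host
      slurp:
        src: /etc/kafka/secrets/masterkey.txt
      register: masterkeyfile

    - name: Extract the key as a per-host fact
      set_fact:
        masterkeyval: "{{ (masterkeyfile['content'] | b64decode).split('\n').2.split('|').2 | trim }}"

    - name: Later tasks should see the variable
      command: echo $CONFLUENT_KEY
```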

Can that be done? How can I define my task / ansible script that will allow me to achieve this?

Thank you

adbdkb
  • Have you double checked you are not re-inventing the wheel? Your requirement really looks like [local facts](https://docs.ansible.com/ansible/latest/user_guide/playbooks_vars_facts.html#facts-d-or-local-facts). You can simply push an executable file in `/etc/ansible/facts.d` on each relevant host that would return the `masterkey` from your existing files and simply use the value wherever needed. – Zeitounator Feb 28 '21 at 16:10
  • I am new to ansible, but looking at the documentation at the link, not sure if that will help. After reading the text from the file, it needs to be set as an environment variable on each host, which will then be used by different tasks that follow the setting of the environment variable. These tasks are shell commands that require that variable to have been set. If you don't mind, can you show me a snippet as to how I can use it to set the env var on the host using local-facts? – adbdkb Feb 28 '21 at 16:50
  • Just use the given variable you get from the local fact in an environment stanza at play level. It will be used for all tasks. If the value is different for each host, it will be changed accordingly. – Zeitounator Feb 28 '21 at 17:21
  • Since, as mentioned, you're new to Ansible: 1. if you need to write a static file, you're doing something wrong; 2. if you involve calls to bash, you're doing something wrong as well. Read up on group_vars and inventory management in Ansible. – G_G Feb 28 '21 at 23:00
  • Thank you. These are a few manual steps being executed on each broker server that I am trying to convert into an Ansible script, to avoid running multiple manual steps by logging on to each server separately. I don't have access to those servers, so I have to ask the admin to perform those actions. I am not creating the static file but reading it, and one of the manual steps is to execute a Confluent script, which is what the call to bash is. I will look at group_vars and inventory management – adbdkb Mar 01 '21 at 00:46

1 Answer


Solution 1: local facts

This solution is IMO the easiest one, but it requires placing a file on each target server.

Let's imagine you put the following executable file in `/etc/ansible/facts.d/kafka.fact`. This is only a dummy example; adapt it to your exact needs. I'm using `jq` to output a proper JSON string. You can `echo` directly if you trust that the key content will not cause problems, and you can use any other executable you like (Python, Ruby, Perl...) as long as it outputs a JSON structure.

#!/bin/bash

# replace here with some logic to read the value you need.
# I'll use a static value for this example
PARSED_KEY="I'm a key that should be parsed dynamically"

# emit the json result using jq (-c keeps it on one line)
jq -nc \
    --arg pk "$PARSED_KEY" \
    '{masterkey: $pk}'
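As noted above, if `jq` is not available you can emit the JSON yourself. A minimal jq-free sketch of the same fact script, which is only safe as long as the key can never contain double quotes or backslashes:

```shell
#!/bin/bash

# jq-free variant: emit the JSON object by hand.
# Only safe while PARSED_KEY never contains double quotes or backslashes.
PARSED_KEY="I'm a key that should be parsed dynamically"

printf '{"masterkey": "%s"}\n' "$PARSED_KEY"
```

The output must still be a valid JSON object, exactly as with the `jq` version.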

Once this is done, you can see that the facts are available for the given host. I'll only demonstrate for localhost here but this will work with any host having this local facts script:

$ ansible localhost -m setup -a filter=ansible_local
localhost | SUCCESS => {
    "ansible_facts": {
        "ansible_local": {
            "kafka": {
                "masterkey": "I'm a key that should be parsed dynamically"
            }
        }
    },
    "changed": false
}

You are now ready to use this fact wherever you need it. Note that you must of course gather facts for this value to be available. In your case, we can test with the following playbook:

---
- hosts: localhost

  environment:
    CONFLUENT_KEY: "{{ ansible_local.kafka.masterkey | default('empty') }}"

  tasks:
    - name: echo our env var
      command: echo $CONFLUENT_KEY
      register: echo

    - name: show the result
      debug:
        msg: "Our env is: {{ echo.stdout }}"

which gives

PLAY [localhost] ***************************************************************************************************************************************************************

TASK [Gathering Facts] *********************************************************************************************************************************************************
ok: [localhost]

TASK [echo our env var] ********************************************************************************************************************************************************
changed: [localhost]

TASK [show the result] *********************************************************************************************************************************************************
ok: [localhost] => {
    "msg": "Our env is: I'm a key that should be parsed dynamically"
}

PLAY RECAP *********************************************************************************************************************************************************************
localhost                  : ok=3    changed=1    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Solution 2: custom facts module

This solution is a bit more complex, but it probably has better management capabilities in the long term and does not require placing files on the target server.

To be as straightforward as possible, I placed my demo facts module in a `library` folder adjacent to my playbook. Placing such a module inside a collection or a role is preferable for a production project, but that goes beyond this (already long) answer; the Ansible developer guide covers all of these subjects in more detail.

Create a `library/kafka_facts.py` file. You will have to adapt it to your exact situation. In this case, I decided that the key would be placed on a single line in `/tmp/keyfile.txt`, and I hardcoded this path in my module. Note that the fact is not returned if the file does not exist. I added all the documentation strings as advised in the Ansible module development documentation.

#!/usr/bin/python

from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

DOCUMENTATION = r'''
---
module: kafka_facts

short_description: This is a test module for example

version_added: "1.0.0"

description: This is a test module for example to answer a question on StackOverflow.

author:
    - Olivier Clavel (@zeitounator)
'''

EXAMPLES = r'''
- name: Return ansible_facts
  kafka_facts:
'''

RETURN = r'''
ansible_facts:
  description: Kafka facts to add to ansible_facts.
  returned: always
  type: dict
  contains:
    kafka_masterkey:
      description: Masterkey present on server.
      type: str
      returned: when /tmp/keyfile.txt exists on server
      sample: 'f18bba7f-caf6-4897-95be-c7d3cc66f98a'
'''

from ansible.module_utils.basic import AnsibleModule


def run_module():
    module_args = dict()

    # Initialize result dict
    result = dict(
        changed=False,
        ansible_facts=dict(),
    )

    module = AnsibleModule(
        argument_spec=module_args,
        supports_check_mode=True
    )

    if module.check_mode:
        module.exit_json(**result)

    keyfile = "/tmp/keyfile.txt"
    try:
        with open(keyfile) as f:
            result['ansible_facts'] = {
                # strip the trailing newline so the fact (and env var) stays clean
                'kafka_masterkey': f.read().strip(),
            }
    except FileNotFoundError:
        pass

    module.exit_json(**result)


def main():
    run_module()


if __name__ == '__main__':
    main()

As in my previous solution, the following demo will be done on localhost only but will work on any target server.

First, we create a key in the expected file:

echo $(uuidgen) > /tmp/keyfile.txt

And we can now use the module as demonstrated in the following playbook:

---
- name: Use custom facts
  hosts: localhost
  gather_facts: false

  environment:
    CONFLUENT_KEY: "{{ ansible_facts.kafka_masterkey | default('N/A') }}"

  tasks:
    - name: echo the env var
      command: echo $CONFLUENT_KEY
      register: echo_before
      changed_when: false

    - name: show our var is empty for now
      debug:
        msg: "CONFLUENT_KEY was returned as: {{ echo_before.stdout }}"

    - name: Gather our custom facts related to kafka
      kafka_facts:

    - name: Echo the env var
      command: echo $CONFLUENT_KEY
      register: echo_after
      changed_when: false

    - name: show our var now has a value (if the file exists)
      debug:
        msg: "CONFLUENT_KEY was returned as: {{ echo_after.stdout }}"

which gives:

PLAY [Use custom facts] ****************************************************************************************************************************************************************************************************************

TASK [echo the env var] ****************************************************************************************************************************************************************************************************************
ok: [localhost]

TASK [show our var is empty for now] ***************************************************************************************************************************************************************************************************
ok: [localhost] => {
    "msg": "CONFLUENT_KEY was returned as: N/A"
}

TASK [Gather our custom facts related to kafka] ****************************************************************************************************************************************************************************************
ok: [localhost]

TASK [Echo the env var] ****************************************************************************************************************************************************************************************************************
ok: [localhost]

TASK [show our var now has a value (if the file exists)] *******************************************************************************************************************************************************************************
ok: [localhost] => {
    "msg": "CONFLUENT_KEY was returned as: d8fc525e-3fa9-4401-aca3-19ac915e5c0d"
}

PLAY RECAP *****************************************************************************************************************************************************************************************************************************
localhost                  : ok=5    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
Zeitounator
  • So, this will require me to create a shell script and copy that to a specific location on each server. I would like to be able to use the way I had originally done by parsing the file in its original location and create the environment variable. The issue with that approach is it does it only on the server where I am running the script. Is there a different way, where I do not need to do anything externally outside of the script? Also, we do not have jq installed by default on the servers and cannot get it easily added either – adbdkb Feb 28 '21 at 18:08
  • Can I not do something similar to what I have ` vars: ansible_ssh_extra_args: "-o StrictHostKeyChecking=no" ansible_host_key_checking: false contents: "{{ lookup('file', '/etc/kafka/secrets/masterkey.txt') }}" extract_key: "{{ contents.split('\n').2.split('|').2|trim }}" environment: CONFLUENT_KEY: "{{ extract_key }}"` That will work on every host? – adbdkb Feb 28 '21 at 18:17
  • You can parse a file on each server to look for a value you will add to a variable. This is what ansible does for you automatically with local facts. Feel free to do it your own way. As explained in my answer, you don't need jq, you can echo the json script yourself if you wish, or even do all this in Go or whatever language available on the server. What I demonstrated is still true: if you set an environment at play level, it will be used for all tasks in that play. – Zeitounator Feb 28 '21 at 18:18
  • `lookup` is executed on controller, not on target. – Zeitounator Feb 28 '21 at 18:19
  • Thanks for the details. My concern is being able to add a script to each target server at this location /etc/ansible/facts.d/kafka.fact - in terms of permissions. Hence I am trying to find a way to do something from within the ansible script. Do you have any pointers that I can define a task to read and parse the file on each target server and be able to set the environment value at the play level. The need for this is that the value in the master key is different for each target and and is used by the confluent task when run on that server – adbdkb Feb 28 '21 at 18:34
  • What is wrong with the code in your question? – Zeitounator Feb 28 '21 at 19:35
  • I had not realised that lookup is executed only on the controller. That is what is happening: the env variable is not getting set on the other target servers where I need to run the confluent commands. I was trying to set that value at playbook level for all the servers, hence I used the vars and then set the env with the var value. Since that did not work, I tried the slurp module, which gives me the value for the target server, but I am not sure how to use that value to set an environment variable on that server at the playbook level, so I don't need to define it at the task level. Thanks – adbdkb Feb 28 '21 at 20:48
  • I added an alternative solution which does not require pushing files on the target server. – Zeitounator Mar 01 '21 at 17:07