
I know that for ec2.py I can either specify environment variables via export before calling ec2.py, or use a boto config file with plain-text credentials (or python keyring).

As I already have the AWS key and secret in Ansible Vault, is there a way to auto-export them from the vault, or any other means of passing the values to ec2.py, instead of having to specify them again?

cytopia
  • 177
  • 1
  • 14

2 Answers

1

Well, you could write a simple task to dump the keys from the vault into the AWS credentials file that boto reads.

---
- name: Ensure AWS credentials configuration is present.
  template:
    src: credentials.j2
    dest: "/home/{{ ansible_user }}/.aws/credentials"

credentials.j2

[default]
aws_access_key_id = {{ aws_access_key_id }}
aws_secret_access_key = {{ aws_secret_access_key }}

Where aws_access_key_id and aws_secret_access_key could be stored in a vault.

The task would then need to be run against the Ansible control host (the host that executes ansible-playbook).
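For completeness, a minimal sketch of how that could look end to end. The file name vault.yml and the playbook layout are assumptions, not something from the question; adjust paths and variable names to your setup.

---
# Minimal sketch (assumed file and variable names): render the credentials
# file on the control host itself, pulling the keys from an encrypted vault file.
- hosts: localhost
  connection: local
  vars_files:
    - vault.yml    # defines aws_access_key_id / aws_secret_access_key, encrypted with ansible-vault
  tasks:
    - name: Ensure the ~/.aws directory exists.
      file:
        path: "{{ ansible_env.HOME }}/.aws"
        state: directory
        mode: "0700"

    - name: Ensure AWS credentials configuration is present.
      template:
        src: credentials.j2
        dest: "{{ ansible_env.HOME }}/.aws/credentials"
        mode: "0600"

Run it with ansible-playbook credentials.yml --ask-vault-pass (or point --vault-password-file at a password file) before any play that needs the dynamic inventory.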

The keys would then be stored unencrypted on the Ansible control host. IMHO (I could be wrong here) you need to supply plain AWS keys to boto either via environment variables (the export command) or via the boto configuration.

Ansible makes API calls to AWS via boto. Boto is not part of Ansible. So there is no native way to use parameters defined in Ansible in boto. That functionality would have to be part of boto.
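If writing the keys to disk is not acceptable, the environment keyword can hand the vault values straight to boto for the duration of a play. Below is a minimal sketch with assumed variable names (vault_aws_access_key_id / vault_aws_secret_access_key); note that this only covers AWS modules executed inside the play, because ec2.py runs at inventory time, before any play starts, and so will not see these variables. The AWS modules also accept the keys directly via their aws_access_key / aws_secret_key parameters.

---
# Minimal sketch (assumed variable names): expose the vault values as the
# standard environment variables boto already understands, play-wide.
- hosts: localhost
  connection: local
  vars_files:
    - vault.yml
  environment:
    AWS_ACCESS_KEY_ID: "{{ vault_aws_access_key_id }}"
    AWS_SECRET_ACCESS_KEY: "{{ vault_aws_secret_access_key }}"
  tasks:
    - name: Example AWS call that picks the credentials up from the environment.
      ec2_group:
        name: example-sg
        description: Example security group (placeholder)
        region: eu-central-1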

Henrik Pingel
  • 9,380
  • 2
  • 28
  • 39
  • Henrik, thanks for the solution. However, I don't like the keys being stored in plain text on my disk. Is there also a way to auto-export this to environment variables prior to the AWS playbook run? – cytopia Mar 14 '17 at 10:39
  • Thought so. I would then go with environment variables: write a small script which exports the keys. Updated my answer. – Henrik Pingel Mar 14 '17 at 10:51
  • Hey Henrik, with the script I still have to enter the credentials manually. I already added them to the ansible-vault and would like those values to be auto-exported before the play runs. – cytopia Mar 14 '17 at 12:57
0

As you mentioned you're running Ansible on an EC2 instance, you actually shouldn't use credentials at all, but rather an IAM role attached to the EC2 instance: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html

The idea is that the instance itself can obtain temporary credentials and perform whatever API actions the role's policies allow. Since you never store any credentials anywhere, this is the most secure way to work with the AWS API from an EC2 instance. As Ansible relies on boto, this works out of the box: you just need to create a role with the necessary IAM permissions and attach it to the instance you're running Ansible on. After that, your dynamic inventory will work without any additional credentials.
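A quick way to confirm the role is actually attached and that temporary credentials are being served is to query the instance metadata endpoint from the instance itself. A minimal sketch using the uri module (nothing here is specific to the question's setup):

---
# Minimal sketch: verify from the instance that an IAM role is attached and
# that the metadata service hands out temporary credentials for it.
- hosts: localhost
  connection: local
  tasks:
    - name: List the IAM role attached to this instance.
      uri:
        url: http://169.254.169.254/latest/meta-data/iam/security-credentials/
        return_content: yes
      register: instance_role

    - name: Show the role name boto will use to fetch temporary credentials.
      debug:
        var: instance_role.content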

Osterjour
  • 845
  • 8
  • 12
  • The problem is that I have this dynamic inventory file `ec2.py`, and when I do something like `ansible-playbook play.yml --list-hosts` I receive a Python error from `ec2.py` saying that it does not know any credentials. So I guess it does not matter if the creds come from a role; `ec2.py` seems to need them by some other means. And as said above, I cannot allow the boto clear-text version. Maybe we misunderstood each other – cytopia Mar 20 '17 at 11:50
  • Did you assign the role to the instance you're running the ansible-playbook command on, as described in the link? Boto should then automatically be able to assume the role and get temporary credentials for the API calls – Osterjour Mar 21 '17 at 12:46