
I'm trying to generate an EC2 inventory from multiple AWS accounts and write it to a single CSV file.

I have 5 accounts: Dev, Test, DevOps, Prepared, and Prod. Normally I generate the inventory with the bash script below, but it still requires some manual steps:

#!/bin/bash
reg="ap-south-1"
input="profile.txt" # This file contains all the profile names configured in the ~/.aws/credentials file
while IFS= read -r line
do
    aws ec2 describe-instances --profile "${line}" --region "${reg}" --query "Reservations[*].Instances[*].[Tags[?Key=='Name']|[0].Value,InstanceId,InstanceType,PrivateIpAddress,State.Name,Platform,Placement.AvailabilityZone]" --output text >> ec2_inventory.csv
done < "$input"

So I have written a Python boto3 script to generate the EC2 inventory.

It works as expected for a single AWS profile; the script is below:

import csv
import datetime

import boto3

session = boto3.Session(profile_name='dev')
ec2 = session.client('ec2', region_name='ap-south-1')
response = ec2.describe_instances()

time = datetime.datetime.now().strftime('%Y-%m-%d-%H-%M-%S')
filename_describe_instances = 'ec2_inventory_ap-south-1_' + time + '.csv'
fieldnames = ['Instance_Name', 'ImageId', 'InstanceId', 'InstanceType', 'Availability_Zone', 'Platform', 'PrivateIpAddress', 'PublicIpAddress', 'State', 'SubnetId', 'VpcId', 'Environment', 'AccountId']

with open(filename_describe_instances, 'w', newline='') as csvFile:
    writer = csv.writer(csvFile, dialect='excel')
    writer.writerow(fieldnames)
    for Reserv in response['Reservations']:
        for Insta in Reserv['Instances']:
            instance_imageid = Insta.get('ImageId', 'NULL')
            instance_InstanceId = Insta.get('InstanceId', 'NULL')
            instance_InstanceType = Insta.get('InstanceType', 'NULL')
            instance_Availability_Zone = Insta['Placement'].get('AvailabilityZone', 'NULL')
            instance_Platform = Insta.get('Platform', 'Linux')  # 'Platform' is only present for Windows instances
            instance_Private_IP = Insta.get('PrivateIpAddress', 'NULL')
            instance_Public_IP = Insta.get('PublicIpAddress', 'NULL')
            instance_State = Insta['State'].get('Name', 'NULL')
            instance_Subnet = Insta.get('SubnetId', 'NULL')
            instance_VPCID = Insta.get('VpcId', 'NULL')
            instance_OwnerId = Reserv.get('OwnerId', 'NULL')

            # Default the tag-derived fields so untagged instances don't raise NameError
            instance_Name = 'NULL'
            instance_Environment = 'NULL'
            for n in Insta.get('Tags', []):  # default to an empty list, not the string 'NULL'
                if n.get('Key') == 'Name':
                    instance_Name = n.get('Value', 'NULL')
                if n.get('Key') == 'Environment':
                    instance_Environment = n.get('Value', 'NULL')

            raw = [instance_Name,
                   instance_imageid,
                   instance_InstanceId,
                   instance_InstanceType,
                   instance_Availability_Zone,
                   instance_Platform,
                   instance_Private_IP,
                   instance_Public_IP,
                   instance_State,
                   instance_Subnet,
                   instance_VPCID,
                   instance_Environment,
                   instance_OwnerId]

            writer.writerow(raw)

So, could someone help me extend this script to generate the inventory from multiple AWS accounts and write it to a single CSV file?

jawad846
    You could use the [configparser](https://docs.python.org/3/library/configparser.html) from the python standard library to read all section names (i.e. profile names) from your `~/.aws/config` and create a for loop. In each iteration you create a new session based on the profile. – Maurice Feb 07 '21 at 12:34
  • Thanks, @Maurice. Could you please help with examples? I'm new to Python and boto3. – jawad846 Feb 07 '21 at 15:02
  • I have configured all my credentials in ~/.aws/credentials – jawad846 Feb 07 '21 at 15:03
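
For reference, a minimal sketch of the configparser approach Maurice describes, assuming all profiles live in `~/.aws/credentials` (the region and loop body are placeholders, not a tested solution):

import configparser
import os

import boto3

# In ~/.aws/credentials each section name is a profile name
config = configparser.ConfigParser()
config.read(os.path.expanduser('~/.aws/credentials'))

for profile in config.sections():
    session = boto3.Session(profile_name=profile)
    ec2 = session.client('ec2', region_name='ap-south-1')
    response = ec2.describe_instances()
    # ...append this profile's rows to the shared CSV...

Alternatively, `boto3.Session().available_profiles` returns the configured profile names without parsing the files yourself.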

2 Answers


EDIT - revising some basic details.

The simplest thing, if you have already configured your profiles, is to loop over them, and use a boto3 Session object to get your instance details inside the loop.

import boto3

# set up your csv.writer, output file, etc. outside the loop

# iterate over your profiles
profiles = ['Dev', 'Test', 'DevOps', 'Prepared', 'Prod']

for name in profiles:
    session = boto3.Session(profile_name=name)
    ec2 = session.client('ec2')
    response = ec2.describe_instances()
    # format each row and write it to the csv

You can hard-code the profile list, or look into argparse to get them from the command line.
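
Putting that together with the script from the question, a minimal end-to-end sketch might look like the following. It is not a drop-in solution: the profile names, region, and tag handling are taken from the question and are assumptions about your environment.

import csv
import datetime

import boto3

profiles = ['Dev', 'Test', 'DevOps', 'Prepared', 'Prod']  # must match ~/.aws/credentials
region = 'ap-south-1'

timestamp = datetime.datetime.now().strftime('%Y-%m-%d-%H-%M-%S')
filename = 'ec2_inventory_' + region + '_' + timestamp + '.csv'
fieldnames = ['Profile', 'Instance_Name', 'ImageId', 'InstanceId', 'InstanceType',
              'Availability_Zone', 'Platform', 'PrivateIpAddress', 'PublicIpAddress',
              'State', 'SubnetId', 'VpcId', 'Environment', 'AccountId']

with open(filename, 'w', newline='') as csv_file:
    writer = csv.writer(csv_file, dialect='excel')
    writer.writerow(fieldnames)

    for profile in profiles:
        session = boto3.Session(profile_name=profile)
        ec2 = session.client('ec2', region_name=region)

        # Paginate so accounts with many instances are fully covered
        for page in ec2.get_paginator('describe_instances').paginate():
            for reservation in page['Reservations']:
                for instance in reservation['Instances']:
                    tags = {t['Key']: t['Value'] for t in instance.get('Tags', [])}
                    writer.writerow([
                        profile,
                        tags.get('Name', 'NULL'),
                        instance.get('ImageId', 'NULL'),
                        instance.get('InstanceId', 'NULL'),
                        instance.get('InstanceType', 'NULL'),
                        instance.get('Placement', {}).get('AvailabilityZone', 'NULL'),
                        instance.get('Platform', 'Linux'),  # only set for Windows
                        instance.get('PrivateIpAddress', 'NULL'),
                        instance.get('PublicIpAddress', 'NULL'),
                        instance.get('State', {}).get('Name', 'NULL'),
                        instance.get('SubnetId', 'NULL'),
                        instance.get('VpcId', 'NULL'),
                        tags.get('Environment', 'NULL'),
                        reservation.get('OwnerId', 'NULL'),
                    ])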

The profile approach is good, and is something I used extensively for writing reports and audit tools spanning accounts. However, it is worth digging a little deeper into what the cross-account setup should look like. You have two basic options:

  • Cross-account access via IAM User in each account - each account has a unique user with their own key and secret
  • Cross-account access via IAM Role in each account - you create a unique user in one account (e.g., a central "management" account), and create roles in the target accounts that do the actual work.

I strongly recommend the latter as a more secure and flexible alternative (unless your org is already using some form of SSO, which is conceptually similar, but is out of scope of this answer). Using roles is more secure in that you only provision/manage a single credential, and set yourself up to manage all the accounts from a central auth repository.

To set this up, the SDK supports profiles that implicitly call sts:AssumeRole by referencing other profiles. For example:

In ~/.aws/credentials:

[management]
region=us-west-2
aws_access_key_id=...
aws_secret_access_key=...

In ~/.aws/config:

[profile foo-account]
source_profile=management
role_arn=arn:aws:iam::[foo-account-id]:role/Inventory

(Note the difference in the section header names between the two files - that is intentional)

Given the above, if you run aws --profile foo-account ec2 describe-instances, the CLI will use the key/secret under the management profile to call sts:AssumeRole, targeting the Inventory role. If you set the AWS_PROFILE environment variable, you can omit the --profile flag. The same thing works inside your Python script, using the profile_name argument to Session.
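
For example (a short sketch assuming the foo-account profile shown above exists):

import boto3

# The 'management' keys are used behind the scenes to assume the Inventory role in foo-account
session = boto3.Session(profile_name='foo-account')
ec2 = session.client('ec2', region_name='ap-south-1')
response = ec2.describe_instances()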

That role is defined in each account you want to work against (with a corresponding profile), with sufficient permission to read the EC2 data you need, and it must have an AssumeRolePolicyDocument (trust policy) that allows your IAM user to call sts:AssumeRole on the Inventory role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::[foo-account-id]:user/[your-user]"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}

The IAM user, likewise, needs a policy that allows it to call sts:AssumeRole:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": "*"
        }
    ]
}

You can replace the "*" resource reference with a list of the role ARNs. The AssumeRolePolicyDocument lets the named principal(s) assume the role; the user policy allows the user to call AssumeRole against the specified resources. This seems a little counter-intuitive, but remember that authorization requires answering two questions: can the user do it, and does the resource allow it?

Finally, as you get more advanced, once you have the basic account/role setup, you can actually omit the profiles, and just call sts:AssumeRole directly, using a list of accounts.

import boto3

accounts = ['123...789', '234...890']
sts = boto3.client('sts')  # assumes you have a default profile set
for account_id in accounts:
    role_arn = f'arn:aws:iam::{account_id}:role/Inventory'
    creds = sts.assume_role(RoleArn=role_arn, RoleSessionName='inventory')['Credentials']
    session = boto3.Session(
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken'],
    )
    ec2 = session.client('ec2')

If you have to do much cross-account work, taking the time to think through and configure central auth will definitely be worth it. A side effect is that, once you have cross-account roles, you can automate this by running it in a Lambda function in your management account, applying the same permission setup to that execution role (and writing the .csv to S3 instead).
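
A rough sketch of that Lambda variant follows; the environment variables, bucket, role name, and column set here are placeholders for illustration, not a definitive implementation:

import csv
import io
import os

import boto3

# Assumed environment variables for illustration: a comma-separated list of
# target account IDs, an S3 bucket the execution role can write to, and a region.
ACCOUNTS = [a for a in os.environ.get('ACCOUNT_IDS', '').split(',') if a]
BUCKET = os.environ.get('INVENTORY_BUCKET', 'my-inventory-bucket')
REGION = os.environ.get('INVENTORY_REGION', 'ap-south-1')


def handler(event, context):
    sts = boto3.client('sts')
    s3 = boto3.client('s3')

    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(['AccountId', 'InstanceId', 'InstanceType', 'State'])

    for account_id in ACCOUNTS:
        # Assume the Inventory role in each target account
        role_arn = f'arn:aws:iam::{account_id}:role/Inventory'
        creds = sts.assume_role(RoleArn=role_arn, RoleSessionName='inventory')['Credentials']
        session = boto3.Session(
            aws_access_key_id=creds['AccessKeyId'],
            aws_secret_access_key=creds['SecretAccessKey'],
            aws_session_token=creds['SessionToken'],
        )
        ec2 = session.client('ec2', region_name=REGION)
        for page in ec2.get_paginator('describe_instances').paginate():
            for reservation in page['Reservations']:
                for instance in reservation['Instances']:
                    writer.writerow([
                        account_id,
                        instance.get('InstanceId', 'NULL'),
                        instance.get('InstanceType', 'NULL'),
                        instance.get('State', {}).get('Name', 'NULL'),
                    ])

    key = 'ec2_inventory.csv'
    s3.put_object(Bucket=BUCKET, Key=key, Body=buffer.getvalue().encode('utf-8'))
    return {'written': f's3://{BUCKET}/{key}'}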

bimsapi
  • Hi bimsapi, I have tried the for loop for multiple AWS accounts as you described above, but I only get results for the last profile in the list. For example, with profiles = ["test1", "test2"] I get the EC2 instance details only for the test2 account. – Bala Dec 06 '22 at 15:00
  • To be clear - the two Python snippets in the answer are for two different configurations. The first - if you have a set of profiles in your ~/.aws/config that use source_profile and role_arn to do the AssumeRole automatically (looping over a list of profiles). The second - perform the assume role explicitly in your code (looping over a list of accounts, without requiring externally configured profiles). Your comment mentions both, so without a code snippet, it's hard to say what the problem is. – bimsapi Dec 08 '22 at 03:36
  • Hi bimsapi, below is the code I am using. It runs fine, but I don't get all accounts' EC2 details in the CSV, only the stag AWS account: https://stackoverflow.com/questions/74714856/get-multiple-aws-account-ec2-inventory-using-boto3/74715750#74715750 – Bala Dec 09 '22 at 08:53

There's a cloud management platform that does this. It lets users manage multiple AWS accounts from a single dashboard and provides AWS inventory management and cost optimisation. It's free: https://cloudplexo.com.

Richardsop