
I'm evaluating AWS Systems Manager, and after configuring it following the official documentation, my EC2 instance shows up and can be centrally managed there. However, I noticed that new EC2 instances (I'm not dealing with on-premises machines) are not displayed. I'm not attaching an IAM role (instance profile) to the instances; instead I use the Default Host Management Configuration, which, according to the documentation, does not require any role associated with the instance.
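For context, this is the trust policy on the role I registered for Default Host Management Configuration, set up as I understood the documentation (the AmazonSSMManagedEC2InstanceDefaultPolicy managed policy is attached to that role separately, so only the trust relationship is shown here):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "ssm.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}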

I also noticed that the vault policy associated with SSM lists a single instance. Even when I change that policy and add other instances to be managed by the same vault, they are not displayed in Systems Manager.

Below is the policy I created to enable instance management through SSM. Where "instance-id" appears, I add the IDs of the instances I want to manage.

Another question the documentation does not make clear: suppose I am responsible for administering hundreds of EC2 instances. How do I configure the policy to cover all of them at once, given that instances can be created and terminated at any time, without having to change the policy or list each instance individually? (One idea I had is sketched after the policy below, but I don't know if it is the right approach.)

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ssm:StartSession",
                "ssm:SendCommand" 
            ],
            "Resource": [
                "arn:aws:ec2:region:*account-id*:instance/*instance-id*",
                "arn:aws:ssm:region:*account-id*:document/SSM-SessionManagerRunShell" 
            ],
            "Condition": {
                "BoolIfExists": {
                    "ssm:SessionDocumentAccessCheck": "true" 
                }
            }
        },
        {
            "Effect": "Allow",
            "Action": [
                "ssm:DescribeSessions",
                "ssm:GetConnectionStatus",
                "ssm:DescribeInstanceInformation",
                "ssm:DescribeInstanceProperties",
                "ec2:DescribeInstances"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "ssm:TerminateSession",
                "ssm:ResumeSession"
            ],
            "Resource": [
                "arn:aws:ssm:*:*:session/${aws:username}-*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kms:GenerateDataKey" 
            ],
            "Resource": "*key-name*"
        }
    ]
}
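The idea I mentioned above for covering all instances is a variation of the first statement that uses a wildcard in the instance part of the ARN instead of individual IDs ("region" and "account-id" remain placeholders); I'm not sure whether this is the recommended approach:

{
    "Effect": "Allow",
    "Action": [
        "ssm:StartSession",
        "ssm:SendCommand"
    ],
    "Resource": [
        "arn:aws:ec2:region:account-id:instance/*",
        "arn:aws:ssm:region:account-id:document/SSM-SessionManagerRunShell"
    ],
    "Condition": {
        "BoolIfExists": {
            "ssm:SessionDocumentAccessCheck": "true"
        }
    }
}

If access needs to be narrowed later, my understanding from the documentation is that a condition key such as ssm:resourceTag/<tag-key> can be added to a statement whose Resource is the instance ARN pattern, so that only tagged instances are covered, but I haven't tested that.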

Tasks I've already done:

a. I reconfigured the above policy, specifying two or more instances to be managed, to no avail.
b. I restarted and terminated the running instances and created new ones.
c. I reviewed the KMS key settings.
d. I recreated this policy and associated it with the Systems Manager service.

  • Not sure what 'vault' you are referring to. I'm not aware of that term in the context of Systems Manager. These instances need to use IMDSv2 and have the SSM agent v3.2.582.0 pre-installed and running. Are those things true of the instances that don't appear to be managed? – jarmod Jun 29 '23 at 19:59
  • More generally, double-check the requirements [here](https://docs.aws.amazon.com/systems-manager/latest/userguide/managed-instances-default-host-management.html). – jarmod Jun 29 '23 at 20:02

0 Answers