
I'm trying to list all the snapshots present in a region using boto3, but I'm not able to list more than 1000. Is there a workaround to list all the EBS snapshots?

import json

import boto3

ec2_client = boto3.client('ec2')

def ebs_snapshot(aws_id):
    # MaxResults above 1000 has no effect; the API returns at
    # most 1000 snapshots per call
    response = ec2_client.describe_snapshots(
        MaxResults=100000,
    )
    print(json.dumps(response, indent=2, default=str))
gkcld

1 Answer


You have to paginate through multiple requests to get all the values. You can either do this yourself via the NextToken parameter in the describe_snapshots() call, or you can use the built-in paginator.
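A minimal sketch of the manual `NextToken` loop described above; the `list_all_snapshots` helper name is illustrative, and the function takes an already-created boto3 EC2 client:

```python
def list_all_snapshots(ec2_client, owner_ids=("self",)):
    """Collect every snapshot by following NextToken until the API stops returning one."""
    kwargs = {"OwnerIds": list(owner_ids), "MaxResults": 1000}  # 1000 is the per-page cap
    snapshots = []
    while True:
        response = ec2_client.describe_snapshots(**kwargs)
        snapshots.extend(response["Snapshots"])
        token = response.get("NextToken")
        if not token:
            # No NextToken in the response means this was the last page
            return snapshots
        kwargs["NextToken"] = token

# Usage (client creation shown for context):
# import boto3
# ec2_client = boto3.client("ec2", region_name="ap-south-1")
# snapshots = list_all_snapshots(ec2_client)

# The built-in paginator does the same token handling for you:
# paginator = ec2_client.get_paginator("describe_snapshots")
# snapshots = [s for page in paginator.paginate(OwnerIds=["self"])
#              for s in page["Snapshots"]]
```

Note that `OwnerIds=["self"]` restricts the results to snapshots you own; without an owner filter, `describe_snapshots` also returns publicly shared snapshots, which can inflate the result set considerably.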

Mark B
  • How can we loop using NextToken, then? – gkcld May 12 '22 at 14:35
  • https://stackoverflow.com/questions/68323546/how-to-use-nexttoken-in-boto3 – Mark B May 12 '22 at 14:35
  • Thanks! I tried that workaround and it helped, but there are over 200,000 snapshots and it errors out with a read timeout from the AWS service – gkcld May 13 '22 at 05:21
  • I'll try increasing the timeout: ```ec2 = boto3.client('ec2', config=Config(read_timeout=400))``` – gkcld May 13 '22 at 05:32
  • Even with the config below it errors out with `raise ReadTimeoutError(endpoint_url=request.url, error=e) botocore.exceptions.ReadTimeoutError: Read timeout on endpoint URL: "https://ec2.ap-south-1.amazonaws.com/"` config: `ec2_client = boto3.client('ec2',region_name='ap-south-1',config=Config(read_timeout=1000,connect_timeout=1000,retries={'max_attempts': 0}))` – gkcld May 16 '22 at 12:15
  • So some of the calls are working, but some of them time out? Or are they all timing out? – Mark B May 16 '22 at 14:25
  • all are timing out – gkcld May 17 '22 at 03:30
  • Are you running this in an AWS Lambda function? – Mark B May 17 '22 at 12:41
  • No actually on Argo Workflow, as a script – gkcld May 18 '22 at 13:22