Questions tagged [boto]

boto is an open-source Python interface to Amazon Web Services

Boto is a Python package that provides an interface to Amazon Web Services. The code is hosted on github.com (https://github.com/boto/boto) and the documentation can be found at http://docs.pythonboto.org.

Boto supports the following services:

  • Compute
    • Amazon Elastic Compute Cloud (EC2)
    • Amazon Elastic MapReduce (EMR)
    • AutoScaling
    • Elastic Load Balancing (ELB)
  • Content Delivery
    • Amazon CloudFront
  • Database
    • Amazon Relational Database Service (RDS)
    • Amazon DynamoDB
    • Amazon SimpleDB
  • Deployment and Management
    • AWS Identity and Access Management (IAM)
    • Amazon CloudWatch
    • AWS Elastic Beanstalk
    • AWS CloudFormation
  • Application Services
    • Amazon CloudSearch
    • Amazon Simple Workflow Service (SWF)
    • Amazon Simple Queue Service (SQS)
    • Amazon Simple Notification Service (SNS)
    • Amazon Simple Email Service (SES)
  • Networking
    • Amazon Route53
    • Amazon Virtual Private Cloud (VPC)
  • Payments and Billing
    • Amazon Flexible Payment Service (FPS)
  • Storage
    • Amazon Simple Storage Service (S3)
    • Amazon Glacier
    • Amazon Elastic Block Store (EBS)
    • Google Cloud Storage
  • Workforce
    • Amazon Mechanical Turk
  • Other
    • Marketplace Web Services
2362 questions
37
votes
6 answers

How to create an ec2 instance using boto3

Is it possible to create an EC2 instance using boto3 in Python? The boto3 documentation is not helping here, and I couldn't find any helpful documents online. Please provide some sample code/links.
MikA
  • 5,184
  • 5
  • 33
  • 42
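A minimal sketch of one common answer, assuming boto3 is installed and AWS credentials are configured; the AMI ID and key-pair name below are hypothetical placeholders:

```python
def instance_params(image_id, instance_type, key_name):
    """Build the keyword arguments for EC2 create_instances()."""
    return {
        "ImageId": image_id,
        "InstanceType": instance_type,
        "KeyName": key_name,
        "MinCount": 1,
        "MaxCount": 1,
    }

def launch(params):
    """Launch one instance; requires valid AWS credentials."""
    import boto3  # imported here so the builder above works without boto3 installed
    ec2 = boto3.resource("ec2")
    return ec2.create_instances(**params)[0]
```

The resource API's create_instances returns a list of Instance objects; call .wait_until_running() on one to block until it boots.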
36
votes
7 answers

How to change metadata on an object in Amazon S3

If you have already uploaded an object to an Amazon S3 bucket, how do you change the metadata using the API? It is possible to do this in the AWS Management Console, but it is not clear how it could be done programmatically. Specifically, I'm using…
natevw
  • 16,807
  • 8
  • 66
  • 90
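One hedged sketch of the usual programmatic answer: S3 object metadata is immutable, so "changing" it means copying the object onto itself with a REPLACE directive (shown here with boto3; names are illustrative):

```python
def merged_metadata(existing, updates):
    """Combine the object's current user metadata with the values to change."""
    return {**existing, **updates}

def replace_metadata(bucket, key, metadata):
    """Rewrite an object's user metadata via an in-place server-side copy."""
    import boto3
    s3 = boto3.client("s3")
    s3.copy_object(
        Bucket=bucket,
        Key=key,
        CopySource={"Bucket": bucket, "Key": key},
        Metadata=metadata,
        MetadataDirective="REPLACE",  # without this, the old metadata is kept verbatim
    )
```

Note the copy also resets system headers such as Content-Type unless you pass them again.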
36
votes
1 answer

Change the number of request retries in boto3

In boto3 or botocore, how do I do the equivalent of setting the number of request retries? E.g. in boto2: from boto import config; config.set('Boto', 'num_retries', '20'). How do I do this in boto3? I've…
DG812
  • 460
  • 1
  • 5
  • 5
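A sketch of the boto3-era equivalent, assuming a reasonably recent botocore (older versions accept only max_attempts, not the mode key):

```python
def retry_settings(max_attempts=20, mode="standard"):
    """The dict handed to botocore's Config(retries=...)."""
    return {"max_attempts": max_attempts, "mode": mode}

def client_with_retries(service, max_attempts=20):
    """Create a boto3 client configured like boto2's num_retries setting."""
    import boto3
    from botocore.config import Config
    return boto3.client(service, config=Config(retries=retry_settings(max_attempts)))
```

The same values can also be set per-profile in ~/.aws/config via retry_mode and max_attempts.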
36
votes
3 answers

Querying for greatest value of Range key on AWS DynamoDb

What is the DynamoDB equivalent of SELECT MAX(RANGE_KEY) FROM MYTABLE WHERE PRIMARYKEY = "value"? The best I can come up with is: from boto.dynamodb2.table import Table as awsTable; tb = awsTable("MYTABLE"); rs =…
Vishal
  • 2,097
  • 6
  • 27
  • 45
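One common answer, sketched with boto3 rather than the boto2 API in the question: query in descending range-key order and take only the first item. Table and attribute names are hypothetical:

```python
def top_value(items, attribute):
    """First item's attribute from a descending query, or None when empty."""
    return items[0][attribute] if items else None

def max_range_key(table_name, pk_name, pk_value, range_name):
    """MAX(range_key) for one partition key, without scanning the partition."""
    import boto3
    from boto3.dynamodb.conditions import Key
    table = boto3.resource("dynamodb").Table(table_name)
    resp = table.query(
        KeyConditionExpression=Key(pk_name).eq(pk_value),
        ScanIndexForward=False,  # descending order by range key
        Limit=1,                 # read only the top item
    )
    return top_value(resp["Items"], range_name)
```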
36
votes
2 answers

Boto - Uploading file to a specific location on Amazon S3

This is the code I'm working from: import sys; import boto; import boto.s3; # AWS ACCESS DETAILS; AWS_ACCESS_KEY_ID = ''; AWS_SECRET_ACCESS_KEY = ''; bucket_name = AWS_ACCESS_KEY_ID.lower() + '-mah-bucket'; conn = boto.connect_s3(AWS_ACCESS_KEY_ID,…
Jimmy
  • 12,087
  • 28
  • 102
  • 192
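A hedged sketch of the usual answer: S3 has no real directories, so a "specific location" is just a key prefix. Shown with boto3; the prefix and names are illustrative:

```python
def object_key(prefix, filename):
    """Join a 'folder' prefix and a file name into a single S3 key."""
    return "%s/%s" % (prefix.strip("/"), filename)

def upload(local_path, bucket, prefix, filename):
    """Upload a local file under the given key prefix."""
    import boto3
    boto3.client("s3").upload_file(local_path, bucket, object_key(prefix, filename))
```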
33
votes
5 answers

Recommended way to manage credentials with multiple AWS accounts?

What is the best way to manage multiple Amazon Web Services (AWS) accounts through boto? I am familiar with BotoConfig files, which I'm using. But each file describes only a single account... and I am working with more than just the one organization.…
Jonathan Eunice
  • 21,653
  • 6
  • 75
  • 77
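One widely used approach, sketched here: keep one named profile per account in ~/.aws/credentials and select a profile per session. The profile names below are hypothetical:

```python
import configparser

def profile_names(credentials_text):
    """Parse the profile sections out of an AWS credentials file.

    The file is standard INI, e.g.:
        [org-a]
        aws_access_key_id = ...
        aws_secret_access_key = ...
        [org-b]
        ...
    """
    parser = configparser.ConfigParser()
    parser.read_string(credentials_text)
    return parser.sections()

def session_for(profile):
    """One boto3 Session per account, selected by profile name."""
    import boto3
    return boto3.Session(profile_name=profile)
```

Each Session then hands out clients and resources scoped to that account's credentials.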
32
votes
2 answers

Fastest way to download 3 million objects from a S3 bucket

I've tried using Python + boto + multiprocessing, S3cmd and J3tset, but I'm struggling with all of them. Any suggestions, perhaps a ready-made script you've been using, or another way I don't know of? EDIT: eventlet+boto is a worthwhile solution, as…
Jagtesh Chadha
  • 2,632
  • 2
  • 23
  • 30
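A minimal sketch of one answer pattern: the work is network-bound, so a thread pool (rather than multiprocessing) with many concurrent downloads usually helps. Worker count and naming scheme are illustrative assumptions:

```python
def chunked(keys, size):
    """Split a long key list into batches (useful for progress tracking)."""
    return [keys[i:i + size] for i in range(0, len(keys), size)]

def download_all(bucket, keys, dest_dir, workers=32):
    """Download many objects concurrently with a thread pool."""
    import os
    import boto3
    from concurrent.futures import ThreadPoolExecutor
    s3 = boto3.client("s3")

    def fetch(key):
        s3.download_file(bucket, key, os.path.join(dest_dir, key.replace("/", "_")))

    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(fetch, keys))
```

For millions of objects, also paginate the key listing and consider resuming by skipping files that already exist locally.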
31
votes
5 answers

How to generate a temporary url to upload file to Amazon S3 with boto library?

I know how to download a file this way: key.generate_url(3600). But when I tried to upload with key.generate_url(3600, method='PUT'), the URL didn't work. I was told: "The request signature we calculated does not match the signature you provided. Check…
michael.luk
  • 551
  • 2
  • 6
  • 11
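A sketch of the boto3-era equivalent. The signature-mismatch error in the question typically means the upload request's method or headers differ from what was signed, so the client must PUT with exactly the signed parameters:

```python
MAX_EXPIRY = 7 * 24 * 3600  # SigV4 presigned URLs cap out at seven days

def clamp_expiry(seconds):
    """Keep the expiry within the range S3 will accept."""
    return max(1, min(seconds, MAX_EXPIRY))

def presigned_put(bucket, key, expires=3600):
    """Generate a URL a client can PUT a file to without AWS credentials."""
    import boto3
    s3 = boto3.client("s3")
    return s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=clamp_expiry(expires),
        HttpMethod="PUT",  # the client must upload with PUT, not POST
    )
```

If a Content-Type is included in Params at signing time, the uploader must send the same Content-Type header or the signature check fails.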
29
votes
2 answers

Consuming a kinesis stream in python

I can't seem to find a decent example that shows how I can consume an AWS Kinesis stream via Python. Can someone please provide some examples I could look into?
aliirz
  • 1,008
  • 2
  • 13
  • 25
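A minimal single-shard sketch using boto3's get_shard_iterator/get_records loop; it assumes a one-shard stream and omits checkpointing (for production, the Kinesis Client Library handles shards and checkpoints for you):

```python
def record_payloads(response):
    """Pull the raw data blobs out of a get_records response."""
    return [record["Data"] for record in response["Records"]]

def read_shard(stream_name, poll_seconds=1.0):
    """Yield records from the first shard, oldest available first."""
    import time
    import boto3
    kinesis = boto3.client("kinesis")
    shard = kinesis.describe_stream(StreamName=stream_name)["StreamDescription"]["Shards"][0]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard["ShardId"],
        ShardIteratorType="TRIM_HORIZON",  # start from the oldest available record
    )["ShardIterator"]
    while iterator:
        response = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for payload in record_payloads(response):
            yield payload
        iterator = response.get("NextShardIterator")
        time.sleep(poll_seconds)  # stay under the per-shard read rate limit
```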
28
votes
7 answers

How to download the latest file of an S3 bucket using Boto3?

The other questions I could find were referring to an older version of Boto. I would like to download the latest file of an S3 bucket. In the documentation I found that there is a method list_object_versions() that gets you a boolean IsLatest.…
jz22
  • 2,328
  • 5
  • 31
  • 50
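A common Boto3 answer, sketched here without versioning: list the objects and pick the one with the newest LastModified. Note list_objects_v2 returns at most 1000 keys per call, so larger buckets need the paginator:

```python
def newest_key(objects):
    """Most recently modified entry from a list_objects_v2 'Contents' list."""
    return max(objects, key=lambda obj: obj["LastModified"])["Key"]

def download_latest(bucket, destination):
    """Download the most recently modified object in the bucket."""
    import boto3
    s3 = boto3.client("s3")
    contents = s3.list_objects_v2(Bucket=bucket).get("Contents", [])
    key = newest_key(contents)
    s3.download_file(bucket, key, destination)
    return key
```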
28
votes
4 answers

How to get the row count of a table instantly in DynamoDB?

I'm using boto.dynamodb2, and it seems I can use Table.query_count(). However, it raises an exception when no query filter is applied. What can I do to fix this? BTW, where is the documentation for the filters that boto.dynamodb2.table.Table.query can use?…
Kane Blueriver
  • 4,170
  • 4
  • 29
  • 48
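One hedged answer for the "instantly" part: DescribeTable returns an ItemCount that is free and immediate, but DynamoDB only refreshes it roughly every six hours, so it is approximate. An exact count requires a (paged, billed) Scan with Select='COUNT':

```python
def item_count(describe_response):
    """Extract ItemCount from a DescribeTable response."""
    return describe_response["Table"]["ItemCount"]

def approximate_count(table_name):
    """Instant row count; stale by up to ~6 hours."""
    import boto3
    return item_count(boto3.client("dynamodb").describe_table(TableName=table_name))
```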
28
votes
7 answers

Boto [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed while connecting to S3

I am trying to connect to S3 using boto, but it seems to fail. I've tried some workarounds, but they don't seem to work. Can anyone please help me with this? Below is the code: import boto if not boto.config.has_section('Credentials'): …
Siddarth
  • 1,000
  • 1
  • 10
  • 17
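One frequent cause of this error, sketched as a hedged workaround: bucket names containing dots break validation of the *.s3.amazonaws.com wildcard certificate under virtual-hosted addressing, and boto2's OrdinaryCallingFormat switches to path-style URLs that avoid the mismatch:

```python
def has_dots(bucket_name):
    """Dotted bucket names are the usual trigger for this certificate error."""
    return "." in bucket_name

def connect_bucket(access_key, secret_key, bucket_name):
    """Connect with path-style addressing when the bucket name contains dots."""
    import boto
    from boto.s3.connection import OrdinaryCallingFormat
    kwargs = {}
    if has_dots(bucket_name):
        kwargs["calling_format"] = OrdinaryCallingFormat()  # path-style URLs
    conn = boto.connect_s3(access_key, secret_key, **kwargs)
    return conn.get_bucket(bucket_name)
```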
28
votes
5 answers

How to store data in GCS while accessing it from GAE and 'GCE' locally

There's a GAE project using the GCS to store/retrieve files. These files also need to be read by code that will run on GCE (needs C++ libraries, so therefore not running on GAE). In production, deployed on the actual GAE > GCS < GCE, this setup…
28
votes
4 answers

How can I copy files bigger than 5 GB in Amazon S3?

Amazon S3 REST API documentation says there's a size limit of 5 GB for upload in a PUT operation. Files bigger than that have to be uploaded using multipart. Fine. However, what I need in essence is to rename files that might be bigger than that. As…
Pedro Werneck
  • 40,902
  • 7
  • 64
  • 85
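A hedged sketch of the modern answer: boto3's managed copy switches to multipart UploadPartCopy automatically above the threshold, so a "rename" of any size is a copy plus a delete. The byte-range helper shows how a manual multipart copy (as needed with boto2) would split the object:

```python
def part_ranges(total_bytes, part_bytes=100 * 1024 * 1024):
    """Inclusive byte ranges for a manual multipart copy."""
    return [(start, min(start + part_bytes, total_bytes) - 1)
            for start in range(0, total_bytes, part_bytes)]

def rename_large(bucket, src_key, dst_key):
    """Server-side 'rename' that works above the 5 GB single-PUT copy limit."""
    import boto3
    s3 = boto3.resource("s3")
    s3.Object(bucket, dst_key).copy({"Bucket": bucket, "Key": src_key})
    s3.Object(bucket, src_key).delete()  # complete the rename
```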
27
votes
1 answer

Django Storage Backend for S3

I'm looking for a good Django custom storage backend for use with Amazon S3. I've been googling around and found a lot of blog posts with code snippets or half-baked gist.github.com one-off jobs. But I can't seem to find a solid, well-tested one. Is…
Chris W.
  • 37,583
  • 36
  • 99
  • 136
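The widely used, well-tested answer to this question became django-storages with its S3 backend; a minimal settings.py fragment is sketched below. The bucket and region values are hypothetical, and newer Django versions configure this via the STORAGES dict rather than DEFAULT_FILE_STORAGE:

```python
# settings.py fragment for django-storages (pip install django-storages boto3)
INSTALLED_APPS = [
    # ... your existing apps ...
    "storages",
]
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-bucket"   # hypothetical bucket name
AWS_S3_REGION_NAME = "us-east-1"        # hypothetical region
```

With this in place, Django's default FileField/ImageField uploads go to S3 instead of local disk.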