
I'm using the AWS EC2 service with awscli. Now I want to put all the commands I type in the console into a Python script. I see that if I write `import awscli` inside a Python script it works fine, but I don't understand how to use it within the script. For instance, how do I execute the command `aws ec2 run-instances <arguments>` inside the Python script after `import awscli`? Just to make it clear, I'm not looking for a solution like `os.system('aws ec2 run-instances <arguments>')`; I'm looking for something like

import awscli
awscli.ec2_run-instances(<arguments>)
e271p314

6 Answers


You can do it with the brilliant `sh` package, which can wrap a shell command so it looks like a Python call.

import sh
s3 = sh.bash.bake("aws s3")
s3.put("file","s3n://bucket/file")
Yaro Nosa
  • The `sh` package is definitely interesting. Thanks for the pointer. But it is [not for Windows](http://amoffat.github.io/sh/sections/faq.html?highlight=windows#will-windows-be-supported) though (will use the package for my *nix platforms) – Kiran Subbaraman Jan 23 '18 at 11:23
  • this should be the accepted answer, due to the lack of features and strange limits of boto3 – ricoms Jun 19 '21 at 08:31
  • This only outputs things like "" instead of expected aws output – motizukilucas Mar 03 '22 at 15:27
  • I get `STDERR: /usr/bin/bash: aws s3: No such file or directory` – ru111 Aug 31 '22 at 15:53

The CLI is better suited to the shell prompt; for a proper Python API, check the boto library. This example shows how to launch an instance: http://boto.readthedocs.org/en/latest/ec2_tut.html

Julio Faerman
  • except the boto library can't do some things that the awscli library can do, e.g. s3 sync. – Erik K Jul 05 '19 at 16:40

Boto3 doesn't have everything the CLI has, so once in a blue moon you may have to call the CLI from a script. I can't find an analog for `aws deploy push` in boto3, for example, so here is how I push to S3 with the CLI from a Python script. To Julio's point, though, I use boto for everything else.

import subprocess

cmd = ('aws deploy push --application-name SomeApp '
       '--s3-location s3://bucket/Deploy/db_schema.zip --ignore-hidden-files')
push = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
push.communicate()  # wait for the command to finish so returncode is set
print(push.returncode)
ddtraveller

Well, you can run AWS CLI commands by using `subprocess` in a Python script. For instance, to list the contents of an S3 bucket:

import subprocess

push=subprocess.call(['aws', 's3', 'ls', '--recursive', '--human-readable', '--summarize'])

or

import subprocess

push=subprocess.run(['aws', 's3', 'ls', '--recursive', '--human-readable', '--summarize'])

Hope this helps.
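Both `subprocess.call` and `subprocess.run` as shown above only give you the return code. If you also need the command's output, `subprocess.run` can capture it. A minimal sketch (Python 3.7+ for `capture_output`); `echo` stands in for the AWS CLI here so the example runs without credentials configured:

```python
import subprocess

def run_cli(args):
    """Run a CLI command and return (returncode, stdout_text)."""
    result = subprocess.run(args, capture_output=True, text=True)
    return result.returncode, result.stdout

# In practice pass something like ['aws', 's3', 'ls', '--recursive'];
# 'echo' is used here only so the sketch runs without AWS credentials.
code, out = run_cli(["echo", "hello"])
print(code, out.strip())  # 0 hello
```

The same `run_cli` helper works unchanged for any `aws` subcommand, since the CLI writes its results to stdout.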

Venus713

You can use awscli directly in Python:

from awscli.clidriver import create_clidriver
cli_driver = create_clidriver()
result = cli_driver.main(args=["s3api", "list-buckets"])

That way you trigger the command, but the result will only contain the return code; I haven't found a way to capture the tool's actual output. Additionally, the process will exit if things go wrong.

So I don't recommend using this; I just wanted to add it for informational purposes.

Oliver Heyme
  • Just in case you *want* to go this path (like I did, when [writing a daemon for AWS CLI](https://github.com/janakaud/aws-cli-repl)), you can capture outputs/errors by overriding `sys.stdout` and `sys.stderr` – Janaka Bandara Aug 04 '23 at 11:33
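As Janaka Bandara's comment above notes, the driver's output can be captured by redirecting `sys.stdout`. A minimal sketch of that redirection pattern using `contextlib.redirect_stdout`; the plain `print` callable here is a stand-in, since in practice you would wrap the `cli_driver.main(...)` call:

```python
import contextlib
import io

def capture_stdout(func):
    """Call func and return (its return value, everything it printed)."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        rc = func()
    return rc, buf.getvalue()

# Stand-in callable; in practice you would pass something like
# lambda: cli_driver.main(args=["s3api", "list-buckets"]).
rc, out = capture_stdout(lambda: print("hello") or 0)
print(rc, out.strip())  # 0 hello
```

Note this only catches output written through Python's `sys.stdout`; it will not capture anything the tool writes directly to the OS-level file descriptor.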

A fix/example for smokeny's answer using `sh`. Posting as an answer since I can't comment yet and editing isn't working.

from sh import aws
aws("s3","cp","s3://folder/", ".", "--recursive", "--exclude", "*", "--include", "*.txt")