
I am running a Python script (through TeamCity) which deploys AWS CloudFormation templates. As a prerequisite, all the templates and script files in the Bitbucket repository should be copied to an S3 bucket before the deployment, and the Bitbucket directory structure should be maintained in S3. How can a Python script copy files and directories from Bitbucket to S3? I was using the os.walk method to accomplish this, but it's not copying the subdirectories or the files inside them.

pwd = os.getcwd()
print("Current Dir is: " + pwd)
print("Files in the Directory: ")
for subdir, dirs, files in os.walk(pwd):
    for file in files:
        # 'file' is only the base name; join it with the directory from
        # os.walk, otherwise files in subdirectories cannot be found.
        local_path = os.path.join(subdir, file)
        # Use the path relative to pwd as the S3 key so the directory
        # structure is preserved in the bucket.
        object_name = os.path.relpath(local_path, pwd).replace(os.sep, "/")
        print("object name is " + object_name)
        try:
            response = client_bucket.upload_file(local_path, bucket_name, object_name)
        except Exception:
            print("Error uploading " + local_path)
            for line in traceback.format_exc().splitlines():
                print(line)
Navdeep
  • Even if this works, I will be able to copy only the working directory. How can I copy from other directories. – Navdeep Nov 01 '22 at 14:32
  • What specifically is not working? There are many articles explaining the use of os.walk, for example [here](https://stackoverflow.com/questions/10989005/do-i-understand-os-walk-right) and [here](https://codeparttime.com/os-walk-method-python-explained/). – jarmod Nov 01 '22 at 14:43
  • The parameters to [Bucket.upload_file](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Bucket.upload_file) are Filename and Key. The bucket name is implicit (you don't need to supply it) because you're (presumably) using the [Bucket](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#bucket) interface. For example: `mybucket.upload_file(filename, object_key)` – jarmod Nov 01 '22 at 15:14
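One way to see why only top-level files were uploaded: `os.walk` yields base names, so passing `file` alone to `upload_file` only resolves against the working directory. A minimal sketch of the key-building logic, separated from the upload itself so it can be checked without AWS credentials (the bucket name and client setup shown in the usage comment are placeholders, not from the question):

```python
import os

def list_files_with_keys(root_dir):
    """Walk root_dir and yield (local_path, s3_key) pairs, where
    s3_key mirrors the directory structure under root_dir."""
    for subdir, dirs, files in os.walk(root_dir):
        for name in files:
            # Full path on disk, needed so subdirectory files resolve.
            local_path = os.path.join(subdir, name)
            # Key relative to root_dir, normalized to forward slashes
            # so the same hierarchy appears in S3 on any platform.
            s3_key = os.path.relpath(local_path, root_dir).replace(os.sep, "/")
            yield local_path, s3_key

# Usage (assumes a configured boto3 client and an existing bucket):
# import boto3
# s3 = boto3.client("s3")
# for local_path, s3_key in list_files_with_keys("/path/to/repo"):
#     s3.upload_file(local_path, "my-bucket", s3_key)
```

Keeping the walk/key logic in its own function also makes the error handling simpler: the upload loop can report the exact `local_path` that failed.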

0 Answers