I am running a Python script (through TeamCity) that deploys AWS CloudFormation templates. As a prerequisite, all the template and script files in the Bitbucket repository should be copied to an S3 bucket before the deployment, and the Bitbucket directory structure should be preserved in S3 as well. How can a Python script copy files and directories from Bitbucket to S3? I was using the os.walk method to accomplish this, but it is not copying the subdirectories or the files inside them.
pwd = os.getcwd()
print("Current Dir is: " + pwd)
print("Files in the Directory: ")
for subdir, dirs, files in os.walk(pwd):
    for file in files:
        print(file)
        try:
            object_name = file
            print("object name is " + object_name)
            response = client_bucket.upload_file(file, bucket_name, object_name)
        except:
            print("Error processing account " + account_id)
            for line in str(traceback.format_exc()).splitlines():
                print(line)