
I am new to the AWS world and started exploring it recently. After running an Athena query, I am trying to copy the query result file that Athena generates to another S3 location. The problem is with `file_name`: I am trying to build it dynamically from the query execution ID that Athena returns, appending the `.csv` extension. This raises the following exception:

botocore.errorfactory.NoSuchKey: An error occurred (NoSuchKey) when calling the CopyObject operation: The specified key does not exist.
If I hardcode the file name, e.g. `file_name = '30795514-8b0b-4b17-8764-495b25d74100.csv'`, my code works fine and the copy succeeds. Please help me understand how I can build the source and destination file names dynamically.

import boto3

# boto3 session used for the S3 client
session = boto3.Session(
    aws_access_key_id=AWS_ACCESS_KEY,
    aws_secret_access_key=AWS_SECRET_KEY,
    region_name=AWS_REGION,
)
s3 = session.client('s3')

athena_client = boto3.client(
    "athena",
    aws_access_key_id=AWS_ACCESS_KEY,
    aws_secret_access_key=AWS_SECRET_KEY,
    region_name=AWS_REGION,
)

def main():
    query = "select * from test_table"
    response = athena_client.start_query_execution(
        QueryString=query,
        ResultConfiguration={"OutputLocation": RESULT_OUTPUT_LOCATION}
    )
    queryId = response['QueryExecutionId']
    src_bucket = 'smg-datalake-prod-athena-query-results'
    dst_bucket = 'smg-datalake-prod-athena-query-results'
    # build the result file name from the query execution id
    file_name = queryId + ".csv"
    copy_object(src_bucket, dst_bucket, file_name)

def copy_object(src_bucket, dst_bucket, file_name):
    src_key = f'python-athena/{file_name}'
    dst_key = 'python-athena/cosmo/rss/v2/newsletter/kloka_latest.csv'
    # copy the Athena result file to the destination key
    s3.copy_object(Bucket=dst_bucket, CopySource={'Bucket': src_bucket, 'Key': src_key}, Key=dst_key)

main()
sks
    Please provide a [mre], we'll simply get other errors if we execute the code you shared. For example what is `queryId`? You are calling `copy_object` before even declaring it, etc. – Abdul Aziz Barkat Feb 16 '23 at 05:50
  • 1
    The error is clear. The key that you are using does not exist. Only you can check your buckets for what files you have or don't have. – Marcin Feb 16 '23 at 05:58
  • Does this answer your question? [How to query in AWS athena connected through S3 using lambda functions in python](https://stackoverflow.com/questions/50291963/how-to-query-in-aws-athena-connected-through-s3-using-lambda-functions-in-python) – Abdul Aziz Barkat Apr 12 '23 at 06:31

1 Answer


After executing the Athena query, I just added a short sleep before moving the file to the other location, and it started to work. The code was running so fast that it tried to copy the file before Athena had written it to the query results bucket.
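For reference, a minimal sketch of the change, reusing the variables from the code in my question; the 10-second sleep is just a placeholder value:

import time

# after start_query_execution, as in the question
queryId = response['QueryExecutionId']

# give Athena time to write <queryId>.csv into the results bucket
# before copying it (10 seconds is a placeholder; long queries may need more)
time.sleep(10)

copy_object(src_bucket, dst_bucket, queryId + ".csv")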

sks
    `start_query_execution` just starts the query; it doesn't wait for it to finish. "Some sleep" might work in some cases, but ideally you need to write code that waits for the query to finish running (since longer queries might need _more_ sleep). – Abdul Aziz Barkat Feb 16 '23 at 06:50
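A rough sketch of the polling approach this comment describes, assuming the `athena_client`, `copy_object`, and bucket variables from the question; the poll interval is arbitrary:

import time

def wait_for_query(athena_client, query_id, poll_seconds=2):
    # poll get_query_execution until the query reaches a terminal state
    while True:
        response = athena_client.get_query_execution(QueryExecutionId=query_id)
        state = response['QueryExecution']['Status']['State']
        if state in ('SUCCEEDED', 'FAILED', 'CANCELLED'):
            return state
        time.sleep(poll_seconds)

# copy the result file only after the query has actually finished
if wait_for_query(athena_client, queryId) == 'SUCCEEDED':
    copy_object(src_bucket, dst_bucket, queryId + ".csv")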