I'm trying to copy data from an S3 bucket to a Redshift database using Airflow. Here is my code:
from airflow.hooks.postgres_hook import PostgresHook

path = 's3://my_bucket/my_file.csv'
redshift_hook = PostgresHook(postgres_conn_id='table_name')
access_key = 'abcd'
secret_key = 'aaaa'

# Redshift COPY command to load the CSV from S3 into my_table
query = """
COPY my_table
FROM '%s'
ACCESS_KEY_ID '%s'
SECRET_ACCESS_KEY '%s'
REGION 'eu-west-1'
ACCEPTINVCHARS
IGNOREHEADER 1
FILLRECORD
CSV
BLANKSASNULL
EMPTYASNULL
MAXERROR 100
DATEFORMAT 'MM/DD/YYYY'
""" % (path, access_key, secret_key)

redshift_hook.run(query)
But when I run this script, it raises the following error:
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: connection [SQL: 'SELECT connection.password AS connection_password, connection.extra AS connection_extra, connection.id AS connection_id, connection.conn_id AS connection_conn_id, connection.conn_type AS connection_conn_type, connection.host AS connection_host, connection.schema AS connection_schema, connection.login AS connection_login, connection.port AS connection_port, connection.is_encrypted AS connection_is_encrypted, connection.is_extra_encrypted AS connection_is_extra_encrypted \nFROM connection \nWHERE connection.conn_id = ?'] [parameters: ('elevaate_uk_production',)]
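For completeness, this is roughly how I was planning to wrap the copy in a DAG, in case that changes anything. This is only a rough sketch: the dag id, task id, and the 'redshift_default' conn id are placeholders I made up, not my real setup.

from datetime import datetime

from airflow import DAG
from airflow.hooks.postgres_hook import PostgresHook
from airflow.operators.python_operator import PythonOperator

def copy_s3_to_redshift():
    # 'redshift_default' is a placeholder conn id, not my real one
    hook = PostgresHook(postgres_conn_id='redshift_default')
    hook.run("""
        COPY my_table
        FROM 's3://my_bucket/my_file.csv'
        ACCESS_KEY_ID 'abcd'
        SECRET_ACCESS_KEY 'aaaa'
        REGION 'eu-west-1'
        CSV IGNOREHEADER 1
    """)

dag = DAG(
    dag_id='s3_to_redshift',  # placeholder dag id
    start_date=datetime(2018, 1, 1),
    schedule_interval=None,
)

copy_task = PythonOperator(
    task_id='copy_s3_to_redshift',
    python_callable=copy_s3_to_redshift,
    dag=dag,
)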
Can I get some help with this, please? Thanks in advance.