I'm trying to make a complete copy of a Snowflake DB into a PostgreSQL DB (every table/view, every row). I don't know the best way to go about accomplishing this. I tried a package called pipelinewise, but I could not get the access needed to convert a Snowflake view to a PostgreSQL table (it needs a unique id). Long story short, it just would not work for me.
I've now moved on to using the snowflake-sqlalchemy package, so I'm wondering what the best way is to make a complete copy of the entire DB. Is it necessary to define a model for each table? This is a big DB, so I'd rather not. I'm new to SQLAlchemy in general, so I don't know exactly where to start. My guess is to use reflection, but when I try the example below I'm not getting any results.
from snowflake.sqlalchemy import URL
from sqlalchemy import create_engine, MetaData
engine = create_engine(URL(
    account="xxxx",
    user="xxxx",
    password="xxxx",
    database="xxxxx",
    schema="xxxxx",
    warehouse="xxxx"
))
engine.connect()
metadata = MetaData(bind=engine)
for t in metadata.sorted_tables:
    print(t.name)
I'm pretty sure the issue is not the engine, because I ran the validate.py example and it returns the version as expected. Any advice on why my code above is not working, or on a better way to accomplish my goal of making a complete copy of the DB, would be greatly appreciated.
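From reading the SQLAlchemy docs, my guess is that I'm missing a call to reflect(), since I never actually ask SQLAlchemy to load the table definitions from Snowflake. Below is a sketch of what I had in mind (same placeholder credentials as above); I'm not sure whether this is the right approach, or how to get the reflected tables and their rows over to PostgreSQL afterwards.

from snowflake.sqlalchemy import URL
from sqlalchemy import create_engine, MetaData

engine = create_engine(URL(
    account="xxxx",
    user="xxxx",
    password="xxxx",
    database="xxxxx",
    schema="xxxxx",
    warehouse="xxxx"
))

# Ask SQLAlchemy to load the table definitions from Snowflake
metadata = MetaData()
metadata.reflect(bind=engine)

# sorted_tables should now contain the reflected tables
for t in metadata.sorted_tables:
    print(t.name)

My rough plan after that was to loop over metadata.sorted_tables and copy the rows into a PostgreSQL engine table by table, but I don't know whether that is sensible for a DB this size.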