I have the following table in SQL, which has 86M rows in it:
Transactions
I am trying to get it into a DataFrame with the code below:
# cs is an existing database cursor
data = cs.execute("""
    select * from transactions;
""").fetch_pandas_all()
This takes much too long to load.
How can I make this load faster? Is there a better method I can use? Should I create the table itself in the SQL statement instead of running a select? Any insight would be helpful.
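One idea I have been looking at is fetching in batches instead of materializing everything in one call. A minimal sketch of what I mean, assuming my existing cursor cs and the connector's fetch_pandas_batches() method (the concat at the end is just to stitch the chunks back into one DataFrame):

import pandas as pd

# Stream the result set in chunks rather than pulling
# all 86M rows back in a single fetch.
cs.execute("select * from transactions;")
data = pd.concat(cs.fetch_pandas_batches(), ignore_index=True)

I am not sure whether this actually reduces total time or only peak memory, which is part of what I am asking.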
It is interesting because creating this table in SQL takes about 25 seconds, but pulling the same data into a DataFrame takes about 15 minutes. So I am wondering whether there is a way to achieve the same speed in Python as in SQL.
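For reference, this is the kind of server-side statement I am comparing against, run through the same cursor (the target table name here is just illustrative):

# Server-side copy: the data never leaves the database,
# which is presumably why it finishes in ~25 seconds.
cs.execute("create table transactions_copy as select * from transactions;")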