I am trying to upload a very large pandas DataFrame (13230 rows × 2502 columns) into a Postgres database. I am uploading it with the method `df.to_sql`, but it gives me this error:
`tables can have at most 1600 columns`
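For reference, this is roughly the call I'm making (the connection string, file path, and table name are placeholders, not my real ones):

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string
engine = create_engine("postgresql://user:password@localhost:5432/mydb")

# Placeholder for however the frame is built; df is 13230 rows x 2502 cols
df = pd.read_csv("data.csv")

df.to_sql("my_table", engine, if_exists="replace", index=False)
```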
Therefore I split the df into two DataFrames (13230 rows × 1251 columns each) with the idea of merging them back later. But when I try to upload the first one into the database, I receive the following error:
`row is too big: size 8168, maximum size 8160`
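The split itself is straightforward, roughly like this (slicing on the column axis with `iloc`; the table names are again placeholders):

```python
# Split the 2502 columns into two halves of 1251 columns each
df1 = df.iloc[:, :1251]
df2 = df.iloc[:, 1251:]

# Uploading the first half already fails with the "row is too big" error
df1.to_sql("my_table_part1", engine, if_exists="replace", index=False)
df2.to_sql("my_table_part2", engine, if_exists="replace", index=False)
```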
How can I manage this? Ideally I would like to upload the DataFrame as a whole (13230 rows × 2502 columns), without having to split it up and merge the pieces back later.