Questions tagged [pandas-to-sql]

Pandas DataFrame method that writes the object's records to a SQL database. Be sure to also include the [pandas] tag.

Pandas DataFrame method to_sql can be used to write its records to a SQL database.

The documentation: https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.to_sql.html
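A minimal usage sketch (the file, table, and column names below are placeholders, not taken from any particular question):

```python
import pandas as pd
from sqlalchemy import create_engine

# Minimal sketch: write a small DataFrame to a table called "users"
# in a local SQLite file. Table and file names are placeholders.
engine = create_engine("sqlite:///example.db")

df = pd.DataFrame({"id": [1, 2], "name": ["alice", "bob"]})
df.to_sql("users", engine, if_exists="replace", index=False)
```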

144 questions
1
vote
2 answers

Speeding up Pandas to_sql method

I am trying to use Pandas' to_sql method to upload multiple csv files to their respective table in a SQL Server database by looping through them. fileLoc = r'C:\Users\hcole\Downloads\stats.csv\\' dfArray = ['file1', 'file2', 'file3',…
hstan4
  • 25
  • 6
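For the question above, the usual lever for SQL Server is pyodbc's fast_executemany, enabled on the SQLAlchemy engine. A rough sketch, with placeholder connection details and file names:

```python
import pandas as pd
from sqlalchemy import create_engine

# fast_executemany=True batches parameterized inserts on the pyodbc side,
# which is usually the biggest speed-up for SQL Server. DSN is a placeholder.
engine = create_engine("mssql+pyodbc://user:password@my_dsn", fast_executemany=True)

for name in ["file1", "file2", "file3"]:
    df = pd.read_csv(f"{name}.csv")            # hypothetical file layout
    df.to_sql(name, engine, if_exists="append", index=False, chunksize=1000)
```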
1
vote
0 answers

Can't find a way to insert pandas dataframe into MySQL

So basically I have a function that retrieves data from an API and produces a data series with columns (Data, Realtime Start, Value). I then have a program that inserts into a table named using values from my dictionary. However, I cannot figure out how to…
Robex
  • 47
  • 3
  • 9
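A minimal sketch of writing such a frame to MySQL through SQLAlchemy (driver, credentials, and table name are assumptions):

```python
import pandas as pd
from sqlalchemy import create_engine

# pymysql is one common MySQL driver; connection details are placeholders.
engine = create_engine("mysql+pymysql://user:password@localhost:3306/mydb")

df = pd.DataFrame({
    "date": ["2023-01-01"],
    "realtime_start": ["2023-01-01"],
    "value": [1.23],
})
df.to_sql("observations", engine, if_exists="append", index=False)
```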
1
vote
1 answer

How to load data from a DataFrame into an MSSQL table

engine = sqlalchemy.create_engine('mssql+pyodbc://:@/', pool_pre_ping=True) I have data to be loaded into a DataFrame using a connection string; when I loop through each row in the DF I am able to insert data…
Renuka
  • 11
  • 1
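Rather than looping over rows, to_sql can hand the whole frame to the engine in one call. A sketch under the same pool_pre_ping setup, with a placeholder connection string, file, and table name:

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; pool_pre_ping mirrors the question's setup.
engine = create_engine(
    "mssql+pyodbc://user:password@server/database?driver=ODBC+Driver+17+for+SQL+Server",
    pool_pre_ping=True,
)

df = pd.read_csv("data.csv")                    # hypothetical source file
df.to_sql("target_table", engine, if_exists="append", index=False)  # no per-row loop needed
```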
1
vote
3 answers

Pandas errors in writing chunks to database with df.to_sql()

Existing Database and Desired Outcome: I have a large SQLite database (12 GB, tables with 44 million+ rows) that I would like to modify using Pandas in Python 3. Example Objective: I hope to read one of these large tables (44 million rows) into a DF…
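A sketch of the chunked read-modify-write pattern usually suggested for tables this size (file, table, and column names are hypothetical):

```python
import sqlite3
import pandas as pd

# Read the big table in slices and append each processed slice to a new
# table, so the 44M rows never have to fit in memory at once.
conn = sqlite3.connect("large.db")              # hypothetical file name

for chunk in pd.read_sql_query("SELECT * FROM big_table", conn, chunksize=100_000):
    chunk["value"] = chunk["value"] * 2         # stand-in transformation on a hypothetical column
    chunk.to_sql("big_table_modified", conn, if_exists="append", index=False)

conn.close()
```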
1
vote
0 answers

Upload dataframe to already created table in database with specific data types

1 - I have a dataframe in pandas and a table in MS SQL with primary keys and specific column types. The connection to the data is established with SQLAlchemy. When I use the to_sql method to upload the dataframe with if_exists='replace', the data is uploaded, but…
ilyas
  • 609
  • 9
  • 25
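Because if_exists='replace' drops and recreates the table (losing keys and column types), the usual workaround is to clear the existing table and append into it. A sketch with placeholder names:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Placeholder connection string and table name.
engine = create_engine("mssql+pyodbc://user:password@my_dsn")

with engine.begin() as conn:
    conn.execute(text("DELETE FROM my_table"))  # keep the schema, drop the rows

df = pd.DataFrame({"id": [1], "name": ["x"]})   # stand-in data
df.to_sql("my_table", engine, if_exists="append", index=False)
```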
1
vote
2 answers

Dump a pandas DataFrame which contains a column named "name" to a MySQL table

I am trying to dump my crawled data to MySQL using pandas and to_sql. I am approaching it in two ways. 1> {# -*- coding: utf-8 -*- import numpy as np import pandas as pd import MySQLdb from sqlalchemy import create_engine import os from pandas.io import…
karthik
  • 67
  • 7
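Going through a SQLAlchemy engine lets to_sql quote identifiers itself, so a column literally called "name" needs no manual escaping. A sketch with placeholder connection details and table name:

```python
import pandas as pd
from sqlalchemy import create_engine

# Connection details and table name are placeholders.
engine = create_engine("mysql+pymysql://user:password@localhost/crawl_db")

df = pd.DataFrame({"name": ["item1", "item2"], "price": [10.0, 12.5]})
df.to_sql("products", engine, if_exists="append", index=False)
```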
1
vote
0 answers

Pandas DataFrame to_sql with Flask-SQLAlchemy

I am working with two CSV files that I have merged into one dataframe, which I am currently storing as a SQL database using pandas to_sql(). I am writing my whole app with Flask and I would like to use SQLAlchemy. So far I have created the SQL database…
Ana
  • 167
  • 3
  • 17
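One way to keep everything inside Flask-SQLAlchemy is to reuse its engine for the to_sql call. A sketch assuming a standard Flask-SQLAlchemy setup; the URI, file, and table names are placeholders:

```python
import pandas as pd
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///merged.db"   # placeholder URI
db = SQLAlchemy(app)

df = pd.read_csv("merged.csv")                  # hypothetical merged CSV

with app.app_context():
    # to_sql can take the extension's engine instead of a separate create_engine()
    df.to_sql("merged_table", db.engine, if_exists="replace", index=False)
```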
1
vote
0 answers

('42000', '[42000] [Microsoft][SQL Server Native Client 11.0][SQL Server]Error converting data type varchar to bigint

I saw that there were similar questions in the past as well, but unfortunately I could not find the solution there. I am relatively new to Python. I have a pandas dataframe which I want to write to SQL Server using SQLAlchemy's dataframe.to_sql(). The…
Ashish Dang
  • 41
  • 1
  • 5
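This error typically means a column that SQL Server expects as bigint arrives as text. Casting the column in pandas and passing explicit types through `dtype` usually resolves it; a sketch with placeholder connection details, table, and column names:

```python
import pandas as pd
from sqlalchemy import create_engine
from sqlalchemy.types import BigInteger, NVARCHAR

# Placeholder connection string, table, and column names.
engine = create_engine("mssql+pyodbc://user:password@my_dsn")

df = pd.DataFrame({"account_id": ["123", "456"], "comment": ["a", "b"]})
df["account_id"] = pd.to_numeric(df["account_id"])   # make the column numeric before writing

df.to_sql(
    "accounts",
    engine,
    if_exists="append",
    index=False,
    dtype={"account_id": BigInteger(), "comment": NVARCHAR(255)},
)
```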
0
votes
0 answers

Data loaded to Azure SQL database from pandas dataframe is written with accents in the database

I have been trying to load my pandas dataframe into my Azure SQL database using the bcpandas library. The problem I'm facing is that bcpandas writes string columns to the tables with accents. I have tried multiple ways to get around this but it doesn't…
0
votes
0 answers

Bug with function df.to_sql(if_exists = 'replace')

I'm updating a table from a dataframe that has no more than 40 rows and 8 columns (little data, user information, nothing heavy) with the following code df.to_sql(table_name,self.engine,if_exists = 'replace',index=False) Lately this method hasn't…
Yessica
  • 1
  • 1
0
votes
0 answers

SQLAlchemy & Pandas --> .to_sql() does not insert all the rows

I am trying to insert a dataframe into a MySQL database. The dataframe has 619,008 rows. With this code it inserts the rows very fast, but it doesn't insert all of them; it inserts around 590,000, and the number changes with every execution. I tried to change the…
karkyn
  • 1
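When rows silently go missing, comparing the table's row count against len(df) after the write at least makes the loss detectable. A sketch with a placeholder connection string and table name:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Placeholder connection string and table name.
engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

df = pd.DataFrame({"id": range(619_008)})
df.to_sql("records", engine, if_exists="replace", index=False, chunksize=10_000)

with engine.connect() as conn:
    inserted = conn.execute(text("SELECT COUNT(*) FROM records")).scalar()

assert inserted == len(df), f"only {inserted} of {len(df)} rows were written"
```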
0
votes
1 answer

Load dataframe with timestamp and timezone into postgres database with "to_sql"

I have a pandas dataframe containing sensor data with a timestamp column; the columns are id, time, temp, humi, min, max, mean, std, mean, std, starting with row 0 1.0…
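Passing a timezone-aware SQLAlchemy type through `dtype` makes the column land in Postgres as TIMESTAMP WITH TIME ZONE. A sketch; the connection string, table, and column names are assumptions:

```python
import pandas as pd
from sqlalchemy import create_engine
from sqlalchemy.types import TIMESTAMP

# Placeholder connection string and table name.
engine = create_engine("postgresql+psycopg2://user:password@localhost/sensors")

df = pd.DataFrame({
    "time": pd.to_datetime(["2023-01-01 00:00:00"]).tz_localize("UTC"),
    "temp": [21.5],
})
df.to_sql(
    "readings",
    engine,
    if_exists="append",
    index=False,
    dtype={"time": TIMESTAMP(timezone=True)},   # timezone-aware column type
)
```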
0
votes
1 answer

How do I make sure that not only the last dataframe from a for loop is posted to the database via to_sql?

I am trying to post the dataframe to pgAdmin. The dataframe is edited in a for loop, but when I post the data, only the last element of the loop is posted. I tried using to_sql both inside the for loop and outside it. tables = camelot.read_pdf(pdf_path,…
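Two common fixes: append each iteration's frame with if_exists='append', or collect the frames and write once. A sketch of the collect-then-write variant, with placeholder names standing in for the camelot output:

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string and table name.
engine = create_engine("postgresql+psycopg2://user:password@localhost/mydb")

frames = []
for i in range(3):                              # stands in for the per-page loop
    df = pd.DataFrame({"page": [i], "value": [i * 10]})
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)
combined.to_sql("extracted", engine, if_exists="replace", index=False)
```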
0
votes
0 answers

How to speed up transferring a df to SQL

EDIT: After I saw this thread I removed the multi and chunksize parameters. But now I get the error "ProgrammingError: (pyodbc.ProgrammingError) ('Unknown object type list during describe', 'HY000')". Is it because there are lists in my dataframe…
Lehas123
  • 21
  • 5
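pyodbc cannot bind Python lists as parameters, so list-valued columns need to be serialized (for example to JSON strings) before calling to_sql. A sketch with placeholder connection details and table name:

```python
import json

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string and table name.
engine = create_engine("mssql+pyodbc://user:password@my_dsn")

df = pd.DataFrame({"id": [1, 2], "tags": [["a", "b"], ["c"]]})
df["tags"] = df["tags"].apply(json.dumps)       # lists become plain strings

df.to_sql("items", engine, if_exists="append", index=False)
```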
0
votes
0 answers

What could be the issue: Python SQLAlchemy is not loading data into tables and not even showing any error, even if the password is wrong

I have a Python script where I am connecting to a PostgreSQL database and trying to load the data into the tables. timer('start', 'Writing values to Postgres database...') engine =…
Ritika
  • 1
  • 1
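create_engine() is lazy and does not open a connection, so bad credentials only surface when the database is actually touched. Forcing a connection up front makes such problems fail loudly; a sketch with a placeholder connection string:

```python
from sqlalchemy import create_engine, text

# Placeholder connection string.
engine = create_engine("postgresql+psycopg2://user:password@localhost/mydb")

with engine.connect() as conn:
    conn.execute(text("SELECT 1"))              # raises an error here if auth fails
```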