I am working with a client that has a 4D database. Tableau won't connect to it. (That's a whole other problem; if you know the answer to that, let me know.) What we've decided to do is essentially keep two copies of the data. I am building a tool in Python that will take any arbitrary table from their database and store a copy of it in a MySQL database. It will then run periodically and update the copy as new data is added.
I would prefer to use SQLAlchemy, but it does not support 4D, so I'm using pyodbc with pandas. To read the data I use
data_chunks = pandas.read_sql("SELECT * FROM table_name", con=pyodbc_connection, chunksize=100000)
Then I turn around and use
chunk_df.to_sql("table_name", con=sqlalchemy_mysql_connection, index=False, if_exists="append")
to write it to the MySQL database.
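In case it helps, here is a runnable sketch of that chunked copy loop. I've substituted in-memory SQLite engines for both the pyodbc connection to 4D and the SQLAlchemy MySQL engine (the table name, column names, and seed rows are made up) just so the example is self-contained:

```python
import pandas as pd
import sqlalchemy

# Stand-in engines: in-memory SQLite here so the sketch runs anywhere.
# In the real tool, `source` is a pyodbc connection to 4D and `target`
# is a SQLAlchemy MySQL engine.
source = sqlalchemy.create_engine("sqlite://")
target = sqlalchemy.create_engine("sqlite://")

# Seed the source with some rows (illustration only).
pd.DataFrame({"id": range(10), "val": list("abcdefghij")}).to_sql(
    "table_name", con=source, index=False
)

# The chunked copy loop: read the source in batches, append each batch
# to the target table.
data_chunks = pd.read_sql("SELECT * FROM table_name", con=source, chunksize=4)
for chunk_df in data_chunks:
    chunk_df.to_sql("table_name", con=target, index=False, if_exists="append")

# Verify the copy landed.
copied = pd.read_sql("SELECT * FROM table_name", con=target)
print(len(copied))  # 10
```

With a real 4D source the read side goes through pyodbc instead, but the shape of the loop is the same.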
Unfortunately, some of the tables I'm reading contain corrupt data, and I get a ValueError saying that the year xxxxx is out of range.
The last function called in the traceback was data = cursor.fetchmany(chunksize), which I believe is from pyodbc.
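The obvious thing I tried was wrapping the chunk iteration in try/except and skipping bad batches. A toy stand-in for the chunk iterator (the generator and the error text are made up for illustration) shows why that doesn't really work: once the iterator raises mid-stream, the chunks after the bad batch are never produced.

```python
# Stand-in for pandas' chunk iterator: the second "fetch" raises the
# same kind of ValueError the corrupt 4D dates trigger (the year value
# in the message is illustrative, not real data).
def fake_chunks():
    yield [1, 2, 3]
    raise ValueError("year 65536 is out of range")
    yield [7, 8, 9]  # a later, perfectly good chunk -- never reached

copied = []
chunks = fake_chunks()
while True:
    try:
        copied.extend(next(chunks))
    except StopIteration:
        break
    except ValueError:
        # Skip the bad batch and try to continue -- but the iterator is
        # already dead, so the next call just raises StopIteration.
        continue

print(copied)  # [1, 2, 3] -- everything after the bad batch is lost
```

So skipping the ValueError only salvages the chunks before the corruption, not the ones after it.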
How can I read data from any arbitrary table, handle the corrupt data gracefully, and continue on?