I am trying to load a huge file (about 5,900 lines of SQL CREATE, INSERT, and ALTER TABLE statements) into a MySQL database with Flask-SQLAlchemy.
I am parsing the file and separating the individual commands by splitting on `;`.
This works as expected.
Here is what I have so far: for executing the SQL queries I am using SQLAlchemy's Engine API.
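Roughly, the loader looks like this (simplified sketch; the `db` import and the exact file handling are placeholders for my real setup):

```python
from sqlalchemy import text

from app import db  # assumption: this is where my Flask-SQLAlchemy instance lives

def load_sql_file(path):
    # Read the whole dump and split it into single statements on ';'
    with open(path) as f:
        statements = [s.strip() for s in f.read().split(";") if s.strip()]

    # Execute all statements in one go through the Engine API
    with db.engine.begin() as conn:  # one transaction for everything
        for stmt in statements:
            conn.execute(text(stmt))
```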
When I execute the queries, the database seems to quit after about 5,400 lines of the file, but the application logs the full execution up to line 5,900 without any error.
When I run the CREATEs and the INSERTs separately it also works. So is there a way to split the batch execution, or to use pooling or something like that, so that the database does not get stuck?
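To make clearer what I mean by splitting: I was thinking of something along these lines, with a commit every few hundred statements (the chunk size of 500 is just a guess):

```python
from sqlalchemy import text

CHUNK_SIZE = 500  # arbitrary number, just to illustrate the idea

def load_in_chunks(engine, statements):
    # Commit after every CHUNK_SIZE statements instead of one huge transaction
    for i in range(0, len(statements), CHUNK_SIZE):
        with engine.begin() as conn:  # one transaction per chunk
            for stmt in statements[i:i + CHUNK_SIZE]:
                conn.execute(text(stmt))
```

Would something like this help, or is there a better way?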
Thank you!