
I am trying to load a huge file (roughly 5,900 lines of SQL CREATE, INSERT, and ALTER TABLE statements) into a MySQL database with Flask-SQLAlchemy.

I am parsing the file and separating the individual statements by splitting on ";".

This works as expected.

Here is what I have so far.

For executing the SQL statements I am using the Engine API of SQLAlchemy.
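
A minimal sketch of that setup, assuming a PyMySQL driver; the connection URL and the dump file name are placeholders, and the exact loop may differ from my real code:

    from sqlalchemy import create_engine, text

    # Placeholder connection URL and dump file name; substitute your own.
    engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

    with open("dump.sql") as f:
        # Naive split on ';' -- fine as long as no statement embeds a
        # literal semicolon inside a string value.
        statements = [s.strip() for s in f.read().split(";") if s.strip()]

    with engine.begin() as conn:  # one transaction, committed at the end
        for stmt in statements:
            conn.execute(text(stmt))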

When I execute the queries, the database seems to stop doing its work after roughly 5,400 lines of the file, but the application logs the full execution up to line 5,900 without any error.

When I run the CREATEs and the INSERTs separately it also works. So is there a way to split the batch execution, or to use pooling or something similar, so that the database does not get stuck?
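
To make concrete what I mean by splitting the batch execution, here is a sketch where every chunk of statements runs in its own transaction and is committed before the next chunk starts (execute_in_chunks and chunk_size are names made up for illustration):

    from sqlalchemy import text

    def execute_in_chunks(engine, statements, chunk_size=500):
        # Run every chunk_size statements inside their own transaction,
        # so each chunk is committed before the next one begins.
        for i in range(0, len(statements), chunk_size):
            with engine.begin() as conn:
                for stmt in statements[i:i + chunk_size]:
                    conn.execute(text(stmt))

Smaller chunks keep each transaction bounded, at the cost of more commits and round trips.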

Thank you!

Marcel Hinderlich
  • Is there any particular reason why you are loading an SQL dump via SQLAlchemy? Why not do it with terminal SQL server commands? Then you wouldn't even have to parse. – skytreader Apr 27 '15 at 05:55
  • I have to parse it because I need to modify the table names, since I use a special mapping like _. But after that, the query execution may work that way. I'll test it out. – Marcel Hinderlich Apr 27 '15 at 13:01
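
A rough sketch of the approach from these comments: rewrite the table names first, then pipe the result straight to the mysql command-line client, so no per-statement parsing is needed. TABLE_MAP, the file names, and the credentials are hypothetical placeholders, since the actual mapping is not spelled out above:

    import re
    import subprocess

    # Hypothetical table-name mapping; the real mapping from the comment
    # above is not given in the question.
    TABLE_MAP = {"orders": "app_orders", "users": "app_users"}

    with open("dump.sql") as src:
        sql = src.read()
    for old, new in TABLE_MAP.items():
        # Replace whole-word occurrences of each table name.
        sql = re.sub(r"\b%s\b" % re.escape(old), new, sql)
    with open("dump_renamed.sql", "w") as dst:
        dst.write(sql)

    # Feed the rewritten dump to the mysql client; user, password, and
    # database name are placeholders.
    with open("dump_renamed.sql") as f:
        subprocess.check_call(["mysql", "-u", "user", "-ppassword", "mydb"], stdin=f)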

0 Answers