I'm involved in a project that requires updating 14k database schemas in MySQL, adding 46+ new tables to each. The creation script for these additional tables takes on average 42 seconds to run per schema. Run single-threaded there are no issues, but to avoid taking a week or more to complete the update, we have a tool that multithreads the application of the update script. This is done in C# using Microsoft's Parallel.ForEach with a limit set on the number of threads.

Currently, if I push it past 4 threads, the database periodically responds with fatal errors that, when dug into, show that while waiting for a response the host forcibly closed the connection. Connection Timeout is set to 120000, with the default command timeout also set to 120000 (I've tried changing the latter to int.MaxValue with no discernible difference). On the database side, 'wait_timeout' and 'interactive_timeout' are both set to 28800.
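For reference, the parallel driver looks roughly like this (a minimal sketch; `schemaNames` and `ApplyUpdateScript` are simplified stand-ins for our actual collection and worker method):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Cap concurrency explicitly; above 4 is where the connection drops start.
var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

Parallel.ForEach(schemaNames, options, schemaName =>
{
    // Each iteration opens its own connection and applies the full
    // table-creation script to one schema (the code shown below).
    ApplyUpdateScript(schemaName);
});
```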
The relevant portion of the exception is:
MySql.Data.MySqlClient.MySqlException (0x80004005): Fatal error encountered during command execution. ---> MySql.Data.MySqlClient.MySqlException (0x80004005): Fatal error encountered attempting to read the resultset. ---> MySql.Data.MySqlClient.MySqlException (0x80004005): Reading from the stream has failed. ---> System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
And the code used to run the script is as follows:
using (MySqlConnection conn = new MySqlConnection(connectionString))
{
    conn.Open();
    using (MySqlTransaction trans = conn.BeginTransaction())
    {
        using (MySqlCommand command = conn.CreateCommand())
        {
            // Select the target schema, then run the table-creation script.
            command.CommandText = initialCommandText + query;
            command.ExecuteNonQuery();
        }
        trans.Commit();
    }
}
Where initialCommandText is just the line "use some_schema_name;\r\n", which switches the connection to the target schema before the creation script runs.
If anyone has any ideas, or knows of settings I can manipulate to increase the throughput, it would be greatly appreciated!