
I'm involved in a project that requires updating 14k database schemas in MySQL with 46+ new tables each. The creation script for these additional tables takes on average 42 seconds to run per schema. When run in single-threaded fashion there are no issues, but to avoid the update taking a week or more to complete, we have a tool that multithreads the application of the update script. This is done in C# using Microsoft's Parallel.ForEach with a limit set on the number of threads.

Currently, if I push it past 4 threads, the database periodically responds with fatal errors which, when dug into, show that the host forcibly closed the connection while it was waiting for a response. Connection Timeout is set to 120000, with the default command timeout also set to 120000 (I've tried changing the latter to int.MaxValue with no discernible difference). On the database side, 'wait_timeout' and 'interactive_timeout' are both set to 28800.
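
For context, the driver that fans the script out across the schemas looks roughly like the sketch below. The names here (SchemaUpdater, RunUpdate, schemaNames, updateScript, ApplyScript) are placeholders rather than the real tool, and ApplyScript stands in for the connection/transaction block shown further down; only the Parallel.ForEach shape and the thread limit reflect what is described above.

    using System.Collections.Generic;
    using System.Threading.Tasks;

    // Rough sketch of the multithreaded driver; names are placeholders, not the real tool.
    static class SchemaUpdater
    {
        public static void RunUpdate(IEnumerable<string> schemaNames, string updateScript, string connectionString)
        {
            // The thread limit mentioned above; pushing it past 4 is when the fatal errors appear.
            var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

            Parallel.ForEach(schemaNames, options, schema =>
            {
                // Each schema gets its own connection and runs the same 46+ table creation
                // script, prefixed with "use <schema>;" so it targets the right database.
                ApplyScript(connectionString, "use " + schema + ";\r\n" + updateScript);
            });
        }

        // Placeholder for the connection/transaction block shown in the code sample below.
        static void ApplyScript(string connectionString, string fullScript) { /* ... */ }
    }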

The relevant portion of the exception is:

    MySql.Data.MySqlClient.MySqlException (0x80004005): Fatal error encountered during command execution.
    ---> MySql.Data.MySqlClient.MySqlException (0x80004005): Fatal error encountered attempting to read the resultset.
    ---> MySql.Data.MySqlClient.MySqlException (0x80004005): Reading from the stream has failed.
    ---> System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.

And the code used to run the script is as follows:

    using (MySqlConnection conn = new MySqlConnection(connectionString))
    {
        conn.Open();

        // One transaction per schema update.
        using (MySqlTransaction trans = conn.BeginTransaction())
        {
            using (MySqlCommand command = conn.CreateCommand())
            {
                // initialCommandText switches to the target schema; query is the 46+ table creation script.
                command.CommandText = initialCommandText + query;
                command.ExecuteNonQuery();
            }

            trans.Commit();
        }
    }

Where initialCommandText is just the line "use some_schema_name;\r\n".
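
Put together, the text each worker executes for one schema is essentially composed like this (the CREATE TABLE statements below are placeholders; the real script contains the 46+ new table definitions):

    // Illustrative composition only; the real table definitions are omitted here.
    string initialCommandText = "use some_schema_name;\r\n";
    string query =
        "CREATE TABLE IF NOT EXISTS new_table_01 (id INT PRIMARY KEY);\r\n" +   // placeholder definition
        "CREATE TABLE IF NOT EXISTS new_table_02 (id INT PRIMARY KEY);\r\n";    // ...46+ statements in total
    string commandText = initialCommandText + query;   // what ends up in command.CommandText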

If anyone has any ideas or potential settings that I can manipulate to increase the throughput, it would be greatly appreciated!

TChadwick
  • command has a default timeout property of 30 seconds. make it larger. – jdweng Dec 06 '17 at 17:09
  • That's what my comment about setting the command timeout to int.MaxValue was about. The connection string that I pass to it sets the default command timeout to 120000 seconds (overkill, I know). But thanks anyway! – TChadwick Dec 06 '17 at 17:12
  • It is not in the connection string!!! It is a property of the MySqlCommand. – jdweng Dec 06 '17 at 17:28
  • I understand that, and for kicks I tried setting that (command.CommandTimeout = int.MaxValue;) but no dice. And setting default command timeout=120000; in the connection string, changes the default command timeout for the MySqlCommand to 120000. https://www.connectionstrings.com/mysql-connector-net-mysqlconnection/specifying-default-command-timeout/ – TChadwick Dec 06 '17 at 17:41
  • "When run in single thread fashion, there are no issues" So why wouldn't the connection timeout be an issue then. 1 thread vs N threads? Taking a stab in the dark but could the issue be with having multiple connection from the same account? Maybe there's a limit? – TheRock Dec 07 '17 at 16:30
  • That is along the lines I've been thinking. This is an Amazon RDS managed DB server, and in production it has upwards of 400 to 500 DB connections at the same time with no issues, yet when I open up 100 DB connections to apply the update script to 100 DB schemas simultaneously it starts hitting timeouts, even though the DB CPU is only at 20% utilization and our IOPS are well below the provisioning limit... – TChadwick Dec 07 '17 at 18:54
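
To make the two timeout settings discussed in the comments concrete, here is a minimal sketch of both places they can be set; the server, schema, and credentials are placeholders, and this only illustrates the settings already mentioned above rather than a confirmed fix:

    using MySql.Data.MySqlClient;

    // 1) Connection-string level: "default command timeout" sets the default CommandTimeout
    //    for every command created on connections built from this string.
    string connectionString =
        "Server=my-rds-host;Database=some_schema_name;Uid=updater;Pwd=secret;" +   // placeholder host/credentials
        "Connection Timeout=120000;default command timeout=120000;";

    using (var conn = new MySqlConnection(connectionString))
    {
        conn.Open();

        using (var command = conn.CreateCommand())
        {
            // 2) Per-command level: overrides the connection-string default for this one command.
            command.CommandTimeout = int.MaxValue;
            command.CommandText = "SELECT 1;";   // placeholder statement
            command.ExecuteNonQuery();
        }
    }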

0 Answers