
I have various Google Cloud Functions that read from and write to a Cloud SQL database (MySQL). The processes work, but when the functions happen to run at the same time I get a Broken pipe error. I am using SQLAlchemy with Python; the processes are Cloud Functions and the database is a Google Cloud SQL instance. I have seen suggested solutions that involve raising the timeout values. I was wondering if this is a good approach or if there is a better one. Thanks for your help in advance.

Here's the broken pipe error:

(pymysql.err.OperationalError) (2006, "MySQL server has gone away (BrokenPipeError(32, 'Broken pipe'))") (Background on this error at: http://sqlalche.me/e/13/e3q8)

Here are the MySQL timeout values:

show variables like '%timeout%';
+-------------------------------------------+----------+
| Variable_name                             | Value    |
+-------------------------------------------+----------+
| connect_timeout                           | 10       |
| delayed_insert_timeout                    | 300      |
| have_statement_timeout                    | YES      |
| innodb_flush_log_at_timeout               | 1        |
| innodb_lock_wait_timeout                  | 50       |
| innodb_rollback_on_timeout                | OFF      |
| interactive_timeout                       | 28800    |
| lock_wait_timeout                         | 31536000 |
| net_read_timeout                          | 30       |
| net_write_timeout                         | 60       |
| rpl_semi_sync_master_async_notify_timeout | 5000000  |
| rpl_semi_sync_master_timeout              | 3000     |
| rpl_stop_slave_timeout                    | 31536000 |
| slave_net_timeout                         | 30       |
| wait_timeout                              | 28800    |
+-------------------------------------------+----------+
15 rows in set (0.01 sec)
Doug Stevenson
Kai Ferrall

1 Answer


If you cache your connection for performance, it's normal to lose it after a while. To prevent this, you have to handle disconnection.
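A minimal sketch of how SQLAlchemy can handle stale connections for you (assuming SQLAlchemy 1.2+; `make_engine` and the 1800 s recycle value are illustrative choices, not from the question):

```python
from sqlalchemy import create_engine

def make_engine(db_url: str):
    # pool_pre_ping issues a lightweight ping on each connection checkout
    # and transparently replaces connections the server has dropped,
    # avoiding "MySQL server has gone away" on a cached connection.
    # pool_recycle proactively retires connections older than 30 minutes,
    # well under the server's wait_timeout (28800 s in the dump above).
    return create_engine(
        db_url,
        pool_pre_ping=True,
        pool_recycle=1800,
    )
```

With this in place, a connection dropped by the server is detected and replaced at checkout instead of surfacing as a broken pipe mid-query.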

In addition, because you are working with Cloud Functions, only one request can be handled at a time on one instance (if you have 2 concurrent requests, you will get 2 instances). Thus, set your pool size to 1 to save resources on the database side (in case of heavy parallelization).
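That single-connection pool could look like the sketch below (the `get_engine` helper and the commented MySQL URL are placeholders, assuming the usual Cloud SQL unix-socket setup):

```python
import sqlalchemy
from sqlalchemy.pool import QueuePool

def get_engine(db_url: str):
    # A Cloud Function instance serves one request at a time, so a single
    # pooled connection per instance is enough; max_overflow=0 caps it there.
    # QueuePool is already the default for MySQL; it is named for clarity.
    return sqlalchemy.create_engine(
        db_url,
        poolclass=QueuePool,
        pool_size=1,
        max_overflow=0,
        pool_pre_ping=True,
    )

# Create the engine once at module level so warm instances reuse it across
# invocations (placeholder URL, adjust to your instance):
# engine = get_engine(
#     "mysql+pymysql://user:pw@/db?unix_socket=/cloudsql/PROJECT:REGION:INSTANCE"
# )
```

Creating the engine at module import (rather than inside the function body) lets warm instances reuse the pooled connection instead of reconnecting on every invocation.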

guillaume blaquiere