
I'm running a few apps on the Tornado web server, all of which connect to a MySQL DB using MySQLdb. When I spin up the server, it instantiates a DB class (below) which opens a connection to the DB. All transactions go through this same connection, which I'm not sure is a good idea.

import MySQLdb
import MySQLdb.cursors


class RDSdb(object):
    # RDS_HOST, RDS_USER, RDS_PASS and RDS_DB are set elsewhere on the class

    def __init__(self):
        self.connect()

    def connect(self):
        self.connection = MySQLdb.connect(cursorclass=MySQLdb.cursors.SSDictCursor,
                                          host=self.RDS_HOST, user=self.RDS_USER,
                                          passwd=self.RDS_PASS, db=self.RDS_DB)

    def get_cursor(self):
        # reconnect if the connection was never made or has gone away
        try:
            cursor = self.connection.cursor()
        except (AttributeError, MySQLdb.OperationalError):
            self.connect()
            cursor = self.connection.cursor()
        return cursor

    def fetch_by_query(self, query):
        cursor = self.get_cursor()
        cursor.execute(query)
        result = cursor.fetchall()
        cursor.close()
        return result

I'm pretty sure I shouldn't open/close a new connection for every transaction, but then, when should I?
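
For comparison, this is the open/close-per-request pattern I'm wondering about. It's just a minimal sketch of the idea, nothing Tornado-specific, and the RDS_* names stand in for the same settings as above:

import contextlib

import MySQLdb
import MySQLdb.cursors

@contextlib.contextmanager
def db_connection():
    # open a short-lived connection, hand it out, and always close it afterwards
    conn = MySQLdb.connect(cursorclass=MySQLdb.cursors.SSDictCursor,
                           host=RDS_HOST, user=RDS_USER,
                           passwd=RDS_PASS, db=RDS_DB)
    try:
        yield conn
    finally:
        conn.close()

def fetch_by_query(query):
    with db_connection() as conn:
        cursor = conn.cursor()
        cursor.execute(query)
        result = cursor.fetchall()
        cursor.close()
        return result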

I noticed something else that's a bit off, and I'm certain it's related: when I need to update one of my DB tables' schema (e.g. ALTER TABLE), the whole table in question gets locked and unresponsive until I kill my 3 apps with open connections to the DB. I realize one of those connections was holding up the update.
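
If I understand it right, the likely culprit is that MySQLdb keeps autocommit off, so the first SELECT on each long-lived connection opens a transaction that never gets committed, and that idle transaction holds a metadata lock on the table, which is what makes the ALTER TABLE wait. A small sketch of what I mean (just my guess at a fix, not something I've verified):

    def fetch_by_query(self, query):
        cursor = self.get_cursor()
        try:
            cursor.execute(query)
            result = cursor.fetchall()
        finally:
            cursor.close()
        # end the read-only transaction so its metadata lock is released
        # (alternatively, call self.connection.autocommit(True) right after connecting)
        self.connection.commit()
        return result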

What are the best practices here? Any ideas?
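
For example, would a connection pool along the lines of what SQLAlchemy provides be the right direction? A rough sketch of what I have in mind (the URL and pool numbers are placeholders, and I haven't actually tried this):

from sqlalchemy import create_engine, text

# the URL and pool settings below are only illustrative
engine = create_engine(
    "mysql+mysqldb://user:passwd@host/dbname",
    pool_size=5,
    max_overflow=10,
    pool_recycle=3600,  # recycle connections before MySQL's wait_timeout drops them
)

def fetch_by_query(query):
    # each call borrows a connection from the pool and returns it when done
    with engine.connect() as conn:
        return conn.execute(text(query)).fetchall()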

Thanks.

L-R
  • You will have to use `SQLAlchemy` or `SQLObject` to implement connection pooling. Follow the DRY principle – ronak Mar 15 '13 at 03:16
  • possible duplicate of [MySQL - Persistent connection vs connection pooling](http://stackoverflow.com/questions/9736188/mysql-persistent-connection-vs-connection-pooling) – Air Jun 16 '14 at 23:00

0 Answers