I have a long-running Python script that connects to PostgreSQL and inserts a record whenever a person appears in front of the camera connected to my computer.

I would like to know the best way to connect to the database (and keep that connection): is it necessary to connect and disconnect every time a person appears, or can I keep the connection open somehow? When I create the connection before the infinite loop and there is no activity in front of the camera, the connection sits idle, and by the time the script tries to insert a new row, the connection has been closed. Connecting each time I want to insert a row works without problems, but it is slower.

Thank you for any suggestions.

Lukeluha
  • Maybe you need to configure the database connection not to time out – see the answer from piro (and the comments on it) at https://stackoverflow.com/questions/19963954/set-transaction-query-timeout-in-psycopg2/20101377 – DisappointedByUnaccountableMod May 25 '18 at 09:46
  • How often are people appearing in front of your camera that this is a performance-sensitive operation? Are you storing every non-empty video frame? – jjanes May 25 '18 at 12:54
  • Currently we are testing it at our offices, which means that people appear very often during the day and there is a quiet period from 6 PM to 6 AM. After that gap, when a new person appears, the DB connection is closed. And yes, I am storing every non-empty video frame. – Lukeluha May 25 '18 at 13:21
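The reconnect-on-failure approach behind the question can be sketched generically. `make_resilient_runner` is a hypothetical helper, not from any library: `connect_fn` stands in for something like `functools.partial(psycopg2.connect, **dbopts)`, and `errors` would be `(psycopg2.OperationalError, psycopg2.InterfaceError)` in real use.

```python
def make_resilient_runner(connect_fn, errors):
    """Cache one connection; if an operation fails because the server
    dropped the idle connection, open a fresh one and retry once.

    connect_fn -- zero-argument callable returning a new connection
    errors     -- exception classes meaning "connection is gone"
    """
    state = {"conn": None}

    def run(action):
        if state["conn"] is None:
            state["conn"] = connect_fn()
        try:
            return action(state["conn"])
        except errors:
            # the idle connection was closed under us: reconnect, retry once
            state["conn"] = connect_fn()
            return action(state["conn"])

    return run
```

This keeps the fast path (one persistent connection) while paying the reconnect cost only when the connection has actually been dropped.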

1 Answer

A connection pool works well for this kind of thing. I have not used one in production with raw psycopg2 (I mainly work with Django or SQLAlchemy), but psycopg2.pool includes a few different implementations (e.g. SimpleConnectionPool or PersistentConnectionPool) that should fit your need. Generally speaking, a pool not only manages connections as a shared resource, but also tests and re-initializes a connection when needed.

from psycopg2 import pool

# minconn/maxconn are ints; dbopts are the usual connect() keyword arguments
conn_pool = pool.PersistentConnectionPool(minconn, maxconn, **dbopts)

def work_method():
    conn = conn_pool.getconn()
    with conn.cursor() as stmt:
        stmt.execute(sql)
    conn.commit()  # psycopg2 opens a transaction implicitly; commit the insert
    conn_pool.putconn(conn)

The putconn is extremely important, so that an exception doesn't leave the pool thinking the connection is still in use. It would be good to handle this with a context manager:

import contextlib

@contextlib.contextmanager
def get_db_connection():
    conn = conn_pool.getconn()
    try:
        yield conn
    finally:
        # always return the connection, even if the caller raised
        conn_pool.putconn(conn)

def work_method():
    with get_db_connection() as conn:
        with conn.cursor() as stmt:
            stmt.execute(sql)
        conn.commit()

Hope that helps.
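As an alternative (per the timeout comment under the question), libpq's TCP keepalive settings can be passed straight through `psycopg2.connect`, so the OS probes an idle connection and firewalls/NAT don't silently drop it. The specific values below are illustrative, not recommendations:

```python
# libpq TCP-keepalive parameters, accepted as psycopg2.connect() kwargs
keepalive_opts = {
    "keepalives": 1,            # enable TCP keepalives
    "keepalives_idle": 60,      # seconds idle before the first probe
    "keepalives_interval": 10,  # seconds between probes
    "keepalives_count": 3,      # lost probes before the link is declared dead
}
# conn = psycopg2.connect(**dbopts, **keepalive_opts)
```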

bimsapi