I am currently developing an app with Django and PostgreSQL. New data is written to the database at a very uneven rate (sometimes once a day, sometimes over 30,000 times a day). The UI is built with Vue.js, and I want changes in the database to be pushed to the client in real time.
Currently I am using Django Channels. I implemented a WebSocket consumer and use the psycopg2 library to listen for notifications fired by PostgreSQL triggers. My consumer looks like this:
import json
import select

import psycopg2
import psycopg2.extensions
from channels.generic.websocket import WebsocketConsumer

from .models import Client              # import paths depend on my app layout
from .serializers import ClientSerializer


class WSConsumer(WebsocketConsumer):
    def connect(self):
        self.accept()
        # the whole table is sent to the client right after the connection
        self.send(json.dumps(ClientSerializer(Client.objects.all(), many=True).data))
        # open a dedicated connection and LISTEN for trigger notifications
        conn = psycopg2.connect(dbname='...', user='...')
        conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
        curs = conn.cursor()
        curs.execute("LISTEN update_clients_table;")
        # then I listen for any new data in this table and send it to the client
        while True:
            if select.select([conn], [], [], 5) == ([], [], []):
                print("Timeout")
            else:
                conn.poll()
                while conn.notifies:
                    notify = conn.notifies.pop(0)
                    print("Got NOTIFY:", notify.pid, notify.channel, notify.payload)
                    message = notify.payload
                    self.send(message)

    def disconnect(self, close_code):
        self.close()
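For reference, the notifications come from ordinary triggers on the tables. Below is roughly how one of them is installed from a Django migration; the function name, the table name app_client, and the migration dependency are simplified stand-ins for my actual setup:

# simplified migration that installs the trigger feeding 'update_clients_table'
from django.db import migrations

NOTIFY_SQL = """
CREATE OR REPLACE FUNCTION notify_clients_change() RETURNS trigger AS $$
BEGIN
    -- payload is the changed row as JSON (NOTIFY payloads are limited to ~8 kB)
    PERFORM pg_notify('update_clients_table', row_to_json(NEW)::text);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS clients_notify ON app_client;
CREATE TRIGGER clients_notify
AFTER INSERT OR UPDATE ON app_client
FOR EACH ROW EXECUTE PROCEDURE notify_clients_change();
"""

class Migration(migrations.Migration):

    dependencies = [
        ('app', '0001_initial'),  # placeholder dependency
    ]

    operations = [
        migrations.RunSQL(NOTIFY_SQL, reverse_sql=migrations.RunSQL.noop),
    ]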
The consumer above handles changes in a single table. I have 20 tables in total and around 100 clients that can connect to the WebSocket; a rough sketch of how the same pattern generalizes to several tables is shown below.
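To make the scaling question more concrete, this is a simplified sketch (not my exact code) of the generalized consumer: each connected client gets one psycopg2 connection that LISTENs on one channel per table. The channel names other than update_clients_table are placeholders:

import json
import select

import psycopg2
import psycopg2.extensions
from channels.generic.websocket import WebsocketConsumer

# one NOTIFY channel per table; in reality this list has ~20 entries
CHANNELS = ['update_clients_table', 'update_orders_table']

class MultiTableConsumer(WebsocketConsumer):
    def connect(self):
        self.accept()
        self.conn = psycopg2.connect(dbname='...', user='...')
        self.conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
        curs = self.conn.cursor()
        # a single connection can LISTEN on many channels
        for channel in CHANNELS:
            curs.execute(f"LISTEN {channel};")  # names come from a fixed list, not user input
        while True:
            # wait up to 5 seconds for a notification on any of the channels
            if select.select([self.conn], [], [], 5) == ([], [], []):
                continue  # timeout, poll again
            self.conn.poll()
            while self.conn.notifies:
                notify = self.conn.notifies.pop(0)
                # tag the payload with the channel so the client knows which table changed
                self.send(json.dumps({'channel': notify.channel, 'payload': notify.payload}))

    def disconnect(self, close_code):
        self.conn.close()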
My question: is this a good way to build a real-time app with Django and PostgreSQL? Is it simple, convenient, and secure? Are there other ways to build such an app?
Thanks in advance!