I'm working on a Python application right now that uses PyQt5 and CFFI bindings to libgphoto2.
I have a section of code that polls the camera every 1/60 of a second for a preview image and then schedules a repaint to draw it on screen.
def showPreview(self):
    # Do we have a camera loaded at the moment?
    if self.camera:
        try:
            # Get the preview data from the camera and scale it to fit the widget
            self.__buffer = self.camera.getPreview().scaled(self.size(), Qt.KeepAspectRatio)
            # Schedule a redraw
            self.update()
            # Set up another preview poll in 1/60 of a second
            QTimer.singleShot(1000 // 60, self.showPreview)
        except GPhoto2Error:
            # Ignore any errors from libgphoto2
            pass
The getPreview() method returns a QImage.
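For context, here is roughly what getPreview() does. This is a simplified sketch; the real CFFI calls into libgphoto2 live in my binding layer, and _capture_preview_bytes below is just a stand-in name for that code:

def getPreview(self):
    # Simplified sketch: _capture_preview_bytes is a placeholder for my CFFI
    # binding code, which returns the camera's JPEG preview frame as bytes.
    raw = self._capture_preview_bytes()
    image = QImage()
    image.loadFromData(raw)  # decode the JPEG data into a new QImage
    return image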
When I ran this with a camera connected, I noticed that my system's memory usage kept climbing. Right now it has been running for about 10 minutes: it started at 0.5% memory usage and is already up to nearly 20%.
Correct me if I'm wrong, but shouldn't Python's GC be kicking in and getting rid of the old QImage objects? I suspect they are lingering longer than they should.
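One experiment I'm planning is to force a collection every frame and count the live QImage wrappers (assuming the sip-wrapped objects actually show up in gc.get_objects()); if memory keeps climbing anyway, I'd take that to mean something is still holding references, or that the leak is on the C side of the bindings:

import gc

def countLiveQImages():
    # Count the QImage wrappers the cyclic collector knows about.
    return sum(1 for obj in gc.get_objects() if isinstance(obj, QImage))

# Tacked onto the end of showPreview() for diagnosis only: reference counting
# should free the previous frame as soon as self.__buffer is reassigned, so a
# forced collection only helps if reference cycles are keeping images alive.
gc.collect()
print("live QImage objects:", countLiveQImages())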