
I noticed that Gtk on Windows seems to render images using the CPU rather than the GPU (whereas on Linux this does not seem to be the case).

I am creating a program using Python, Gtk3, and OpenCV which streams video from a camera and displays it in a GtkImage. The program works, but the moment I resize the image to a larger resolution the framerate drops, and I notice that the larger the image, the higher the CPU usage.

Here is a snippet of code:

import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk, GdkPixbuf, GLib
import cv2 as cv

# This method is called from a thread that read()s frames from cv2.VideoCapture
# and displays them in a GtkImage.
def writeDisplay(uiBuilder, frame):
    # Convert the BGR frame from OpenCV to RGB and wrap it in a pixbuf
    frame = cv.cvtColor(frame, cv.COLOR_BGR2RGB)
    h, w, d = frame.shape
    pixbuf = GdkPixbuf.Pixbuf.new_from_data(
        frame.tobytes(), GdkPixbuf.Colorspace.RGB, False, 8, w, h, w*d)

    # Hand the pixbuf to the GtkImage on the main loop
    imageDisplay = uiBuilder.get_object("display")
    GLib.idle_add(imageDisplay.set_from_pixbuf, pixbuf)

On Linux I don't notice any frame drop, which suggests that the GtkImage is rendered by the GPU. On Windows, however, it seems to be software rendered.

I should also note that on Windows I am using PyGObject installed via MSYS2.

Is there any way of streaming video frames from OpenCV to a Gtk3 GUI using hardware acceleration?

1 Answer


GTK3 always uses the CPU to render, regardless of the platform. The only way to use the GPU to render something in GTK3 is through the GtkGLArea widget, where you have access to a GL context.
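
A minimal sketch of what that looks like from Python (assuming PyOpenGL is installed alongside PyGObject; the window setup and handler names here are only illustrative):

import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk
from OpenGL import GL

def on_render(area, context):
    # GL calls made here target the context GTK created for the GtkGLArea.
    GL.glClearColor(0.0, 0.0, 0.0, 1.0)
    GL.glClear(GL.GL_COLOR_BUFFER_BIT)
    return True

window = Gtk.Window(title="GtkGLArea demo")
area = Gtk.GLArea()
area.connect("render", on_render)
window.add(area)
window.connect("destroy", Gtk.main_quit)
window.show_all()
Gtk.main()

Whatever you draw inside the render handler (for example a textured quad holding a video frame) is then rasterized by the GPU.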

ebassi
  • Thanks, I realized that might be the case. Do you have any examples of displaying an image in GtkGLArea (preferably an OpenCV frame)? I have been googling endlessly and I cannot seem to get an image to show up. – coolcatco888 Feb 26 '23 at 02:05
  • I have noticed that a simple glEnable(GL_TEXTURE_2D) call will result in OpenGL throwing a 1280 error when trying to render within a GtkGLArea. – coolcatco888 Feb 26 '23 at 02:32
  • You don't need to enable GL_TEXTURE_2D: that's an old value for the fixed-function pipeline in GL 1. You are probably using or referencing some very old OpenGL documentation. GTK defaults to GL 3.2+ core contexts. See: https://community.khronos.org/t/getting-gl-invalid-enum-when-calling-glenable-gl-texture-2d/72631 As for OpenCV: sorry, I have never used it, so I'm unable to help you with that. – ebassi Feb 27 '23 at 11:44
  • Is there any way to use an older GL 2.1 context that supports glEnable(GL_TEXTURE_2D)? I tried this, but it does not work: area = Gtk.GLArea() area.set_required_version(2, 1) – coolcatco888 Feb 27 '23 at 22:56
  • No, there is no way to ask for a legacy GL 2.1 context; at worst, you can get a 3.0. Nobody should really be using OpenGL pre-3.2. You can find decent tutorials that do not involve writing GL code from 20 years ago. – ebassi Feb 28 '23 at 22:48
  • Thanks, I figured it out. I posted the answer with the entire fixed code here: https://stackoverflow.com/questions/75586457/how-to-render-opencv-frames-in-opengl-3-2-gtkglarea-in-python/75598170 (a sketch of the general approach follows below). – coolcatco888 Feb 28 '23 at 23:09
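
The linked question has the complete working code; what follows is only a condensed sketch of the same general approach (upload each RGB frame as a texture and draw it on a full-screen quad with GL 3.2 core shaders). It assumes PyOpenGL and NumPy are installed, and names such as FrameRenderer and push_frame are illustrative rather than taken from the linked answer.

import ctypes
import numpy as np
import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk
from OpenGL import GL

# GLSL 1.50 matches the GL 3.2 core context that GTK creates by default.
VERT_SRC = """#version 150
in vec2 pos;
in vec2 uv;
out vec2 v_uv;
void main() {
    v_uv = uv;
    gl_Position = vec4(pos, 0.0, 1.0);
}"""

FRAG_SRC = """#version 150
uniform sampler2D tex;
in vec2 v_uv;
out vec4 color;
void main() {
    color = texture(tex, v_uv);
}"""

class FrameRenderer:
    """Owns a GtkGLArea and draws the most recent RGB frame into it."""

    def __init__(self, area):
        self.area = area
        self.frame = None  # latest RGB numpy array, set via push_frame()
        area.connect("realize", self.on_realize)
        area.connect("render", self.on_render)

    def on_realize(self, area):
        area.make_current()
        # Build the shader program (error checking omitted for brevity).
        self.program = GL.glCreateProgram()
        for src, kind in ((VERT_SRC, GL.GL_VERTEX_SHADER),
                          (FRAG_SRC, GL.GL_FRAGMENT_SHADER)):
            shader = GL.glCreateShader(kind)
            GL.glShaderSource(shader, src)
            GL.glCompileShader(shader)
            GL.glAttachShader(self.program, shader)
        GL.glLinkProgram(self.program)

        # Full-screen quad as two triangles; each vertex is x, y, u, v.
        quad = np.array([
            -1, -1, 0, 1,   1, -1, 1, 1,   -1, 1, 0, 0,
             1, -1, 1, 1,   1,  1, 1, 0,   -1, 1, 0, 0,
        ], dtype=np.float32)
        self.vao = GL.glGenVertexArrays(1)
        GL.glBindVertexArray(self.vao)
        vbo = GL.glGenBuffers(1)
        GL.glBindBuffer(GL.GL_ARRAY_BUFFER, vbo)
        GL.glBufferData(GL.GL_ARRAY_BUFFER, quad.nbytes, quad, GL.GL_STATIC_DRAW)
        stride = 4 * quad.itemsize
        pos = GL.glGetAttribLocation(self.program, "pos")
        GL.glEnableVertexAttribArray(pos)
        GL.glVertexAttribPointer(pos, 2, GL.GL_FLOAT, GL.GL_FALSE, stride, None)
        uv = GL.glGetAttribLocation(self.program, "uv")
        GL.glEnableVertexAttribArray(uv)
        GL.glVertexAttribPointer(uv, 2, GL.GL_FLOAT, GL.GL_FALSE, stride,
                                 ctypes.c_void_p(2 * quad.itemsize))

        # One texture object, re-uploaded on every frame.
        self.texture = GL.glGenTextures(1)
        GL.glBindTexture(GL.GL_TEXTURE_2D, self.texture)
        GL.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR)
        GL.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR)

    def on_render(self, area, context):
        GL.glClearColor(0.0, 0.0, 0.0, 1.0)
        GL.glClear(GL.GL_COLOR_BUFFER_BIT)
        if self.frame is not None:
            h, w, _ = self.frame.shape
            GL.glPixelStorei(GL.GL_UNPACK_ALIGNMENT, 1)  # rows are tightly packed
            GL.glBindTexture(GL.GL_TEXTURE_2D, self.texture)
            GL.glTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_RGB, w, h, 0,
                            GL.GL_RGB, GL.GL_UNSIGNED_BYTE, self.frame)
            GL.glUseProgram(self.program)
            GL.glBindVertexArray(self.vao)
            GL.glDrawArrays(GL.GL_TRIANGLES, 0, 6)
        return True

    def push_frame(self, rgb_frame):
        # Call via GLib.idle_add from the capture thread, as in the question.
        self.frame = rgb_frame
        self.area.queue_render()

With this structure the capture thread converts BGR to RGB (as in the question's snippet) and hands each frame over with GLib.idle_add(renderer.push_frame, frame); the texture upload and drawing then happen in the widget's GL context on the main loop.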