
I use the BGFX framework for rendering in an application. I'm on Linux with an NVIDIA graphics card, and the BGFX build I use has OpenGL as its backend (I don't want to switch to the Vulkan backend).

Everything worked fine, but one new feature requires me to use EGL. The first thing I do in main is set EGL to use OpenGL as the rendering API:

if (not eglBindAPI(EGL_OPENGL_API) || (eglGetError() != EGL_SUCCESS))
    //error handling

It works well.

Then I create an X11 window and call eglGetDisplay, eglInitialize, and eglChooseConfig; all of them return without any error.
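For reference, the setup is roughly this (a condensed sketch; display is the X11 Display* of the window, error handling and most config attributes are omitted, the full version is in the test app further below):

EGLDisplay eglDisplay = eglGetDisplay((EGLNativeDisplayType)display);

EGLint major, minor;
eglInitialize(eglDisplay, &major, &minor);

static constexpr EGLint cfgAttr[]{
    EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
    EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
    EGL_NONE
};
EGLConfig eglConfig{};
EGLint numConfigs{0};
eglChooseConfig(eglDisplay, cfgAttr, &eglConfig, 1, &numConfigs);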

Then I call BGFX init, and it runs without any error.
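The BGFX init itself is the standard X11 setup, something like this (a simplified sketch; the real code uses the actual window size and flags, which don't matter here):

#include <bgfx/bgfx.h>
#include <cstdint>

bgfx::Init init;
init.type              = bgfx::RendererType::OpenGL; // force the OpenGL backend
init.platformData.ndt  = display;                    // native X11 Display*
init.platformData.nwh  = (void*)(uintptr_t)window;   // native X11 Window handle
init.resolution.width  = 640;
init.resolution.height = 480;
init.resolution.reset  = BGFX_RESET_VSYNC;
bgfx::init(init);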

At this point I have an initialized BGFX (using the OpenGL backend) and a current OpenGL context (created by BGFX):

std::cout << "GL Cont: " << glXGetCurrentContext() << std::endl; // Valid pointer
std::cout << "EGL Cont: " << eglGetCurrentContext() << std::endl; // 0x0
std::cout << "BGFX Renderer: " << bgfx::getRendererType() << std::endl; // 8 - OpenGL

Then I would like to execute the new EGL code related to the feature on a different thread (I call eglBindAPI on the new thread as well):

EGLContext globalEglContext{};
{
    static constexpr EGLint contextAttr[]{
        EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE
    };
    globalEglContext = eglCreateContext(eglDisplay, eglConfig, EGL_NO_CONTEXT, contextAttr);
    if (EGL_NO_CONTEXT == globalEglContext)
    { /* error handling */ }
}

if (!eglMakeCurrent(eglDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, globalEglContext)) 
{
    printf("Error on eglMakeCurrent (error: 0x%x)", eglGetError());
}

The context creation succeeds, but the eglMakeCurrent call returns false, and the reported error code is 0x3000 (EGL_SUCCESS):

Error on eglMakeCurrent (error: 0x3000)

I cannot simply ignore it, because the next EGL operation fails as well, so it is a real error.


If I execute the very same context creation code on the main thread I get:

Error on eglMakeCurrent (error: 0x3002)

Checking 0x3002 (EGL_BAD_ACCESS) in the manual doesn't explain my case.


If I create my EGL context and make it current on the main thread before initializing BGFX, and I add the following X11 error handler:

XSetErrorHandler(+[](Display *display, XErrorEvent *error)
{
    char buf[255];
    XGetErrorText(display, error->error_code, buf, 255);
    printf("X11 error: %s", buf);
    return 1;
}); 

Then the context creation and making it current work fine, but during BGFX init I get the following error message:

X11 error: GLXBadDrawableX11 error: GLXBadDrawableX11

I have two questions:

  1. Is it possible that EGL and OpenGL (GLX) contexts cannot be used at the same time? (On one thread I would have a current OpenGL context while another thread has a current EGL context.)
  2. If it is not possible to use OpenGL and EGL contexts at the same time, not even on different threads, then how could I use EGL features while continuing to use OpenGL as the rendering backend?

UPDATE:

I created a test app that creates and makes current a GLX context, then creates an EGL context and tries to make it current, which fails.

Does this mean that EGL and OpenGL (GLX) cannot be used at the same time?

The full source code (main.cpp):

#include <iostream>
#include <assert.h>
#include <thread>
#include <chrono>
#include <future>

#include <EGL/egl.h>
#include <EGL/eglext.h>

#include <GL/gl.h>
#include <GL/glx.h>

#include <SDL2/SDL.h>
#include <SDL2/SDL_syswm.h>

int main() 
{
    if (not eglBindAPI(EGL_OPENGL_API) || (eglGetError() != EGL_SUCCESS))
    {
        printf("Could not bind EGL ES API (error: 0x%0x)\n", eglGetError());
        return -1;
    }

    XSetErrorHandler(+[](Display *display, XErrorEvent *error)
    {
        char buf[255];
        XGetErrorText(display, error->error_code, buf, 255);
        printf("X11 error: %s\n", buf);
        return 1;
    });

    //
    // WINDOW
    //
    uint32_t flags = SDL_WINDOW_RESIZABLE;

    const auto sdlWindow = SDL_CreateWindow("win", 0, 0, 640, 480, flags);
    SDL_ShowWindow(sdlWindow);

    SDL_SysWMinfo wmi;
    SDL_VERSION(&wmi.version);
    if (!SDL_GetWindowWMInfo(sdlWindow, &wmi))
    {
        return -1;
    }
    auto display = wmi.info.x11.display;

    //
    // EGL INIT
    //

    void *eglConfig{};
    void *eglDisplay{};
    void *eglSurface{};

    // EGL init
    {
        // Get EGL display
        eglDisplay = eglGetDisplay((EGLNativeDisplayType)display);
        if (eglDisplay == EGL_NO_DISPLAY)
        {
            printf("Could not create EGLDisplay (error: 0x%0x)\n", eglGetError());
            return -1;
        }

        // Init EGL display
        {
            EGLint major;
            EGLint minor;
            if (!eglInitialize(eglDisplay, &major, &minor))
            {
                printf("Failed initializing EGL (error: 0x%0x)\n", eglGetError());
                return -1;
            }
            else
            {
                printf("EGL initialized (Version: %d.%d)\n", major, minor);
            }
        }

        // Choose EGL config
        {
            static constexpr EGLint cfgAttr[]{
                EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
                EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
                EGL_RED_SIZE, 1,
                EGL_GREEN_SIZE, 1,
                EGL_BLUE_SIZE, 1,
                EGL_ALPHA_SIZE, 1,
                EGL_DEPTH_SIZE, 1,
                EGL_NONE
            };

            EGLint numConfigs{0};

            if (!eglChooseConfig(eglDisplay, cfgAttr, &eglConfig, 1, &numConfigs))
            {
                printf("Failed on eglChooseConfig (error: 0x%0x)\n", eglGetError());
                return -1;
            }
        }

        // Create EGL surface
        eglSurface = eglCreateWindowSurface(eglDisplay, eglConfig, wmi.info.x11.window, nullptr);
        if(eglSurface == EGL_NO_SURFACE)
        {
            printf("Could not create EGLSurface (error: 0x%0x)\n", eglGetError());
            return -1;
        }
    }

    //
    // OpenGL context
    //
    const auto screen = DefaultScreenOfDisplay(display);
    const auto screenId = DefaultScreen(display);

    static GLint glxAttribs[] = {
        GLX_RGBA,
        GLX_DOUBLEBUFFER,
        GLX_DEPTH_SIZE,     24,
        GLX_STENCIL_SIZE,   8,
        GLX_RED_SIZE,       8,
        GLX_GREEN_SIZE,     8,
        GLX_BLUE_SIZE,      8,
        GLX_SAMPLE_BUFFERS, 0,
        GLX_SAMPLES,        0,
        None
    };
    XVisualInfo* visual = glXChooseVisual(display, screenId, glxAttribs);

    if (visual == 0)
    {
        printf("Could not create correct visual window.\n");
        return -1;
    }

    GLXContext context = glXCreateContext(display, visual, NULL, GL_TRUE);
    
    if( !glXMakeContextCurrent(display, None, None, context))
    {
        printf("Could not make context current.\n");
        return -1;
    }

    std::cout << "GL Cont: " << glXGetCurrentContext() << std::endl;
    std::cout << "EGL Cont: " << eglGetCurrentContext() << std::endl;

    /*
    // Uncomment this and EGL context creation works
    if( !glXMakeContextCurrent(display, None, None, NULL))
    {
        printf("Could not make context current.\n");
        return -1;
    }

    std::cout << "GL Cont: " << glXGetCurrentContext() << std::endl;
    std::cout << "EGL Cont: " << eglGetCurrentContext() << std::endl;
    */

    //
    // EGL CONTEXT
    //

    auto launchPolicy = std::launch::deferred; // change it to std::launch::async to create EGL context on a thread

    auto res = std::async(launchPolicy, [&](){
        void *globalEglContext;
        {
            static constexpr EGLint contextAttr[]{
                EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE
            };
            globalEglContext = eglCreateContext(eglDisplay, eglConfig, EGL_NO_CONTEXT, contextAttr);
            if (EGL_NO_CONTEXT == globalEglContext)
            {
                printf("Error creating EGL context (error: 0x%x)\n", eglGetError());
                exit(-2);
            }
        }

        // fails with 0x3000 (EGL_SUCCESS) on a different thread.
        // fails with 0x3002 (EGL_BAD_ACCESS) on the main thread.
        if (!eglMakeCurrent(eglDisplay, eglSurface, eglSurface, globalEglContext)) 
        {
            printf("Error on eglMakeCurrent (error: 0x%x)\n", eglGetError());
            exit(-3);
        }
        return 0;
    });

    res.wait();

    std::cout << "GL Cont: " << glXGetCurrentContext() << std::endl;
    std::cout << "EGL Cont: " << eglGetCurrentContext() << std::endl;
}

CMakeLists.txt:

cmake_minimum_required(VERSION 3.5)

project(EGLTest LANGUAGES CXX)

set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

find_package(OpenGL REQUIRED COMPONENTS EGL)
find_package(PkgConfig REQUIRED)

pkg_check_modules(X11 REQUIRED x11)
pkg_check_modules(SDL2 REQUIRED sdl2)

add_executable(${PROJECT_NAME} main.cpp)

target_include_directories(
    ${PROJECT_NAME}
    SYSTEM
    PUBLIC ${OPENGL_EGL_INCLUDE_DIRS}
    PUBLIC ${SDL2_INCLUDE_DIRS}
)

target_link_libraries(
    ${PROJECT_NAME}
    OpenGL::EGL
    ${SDL2_LIBRARIES}
)

UPDATE 2: My config:

Kubuntu 22.04 LTS 5.15.0-52-generic
Operating System: Ubuntu 22.04
KDE Plasma Version: 5.24.6
KDE Frameworks Version: 5.98.0
Qt Version: 5.15.3
Kernel Version: 5.15.0-52-generic (64-bit)
Graphics Platform: X11
Processors: 16 × 11th Gen Intel® Core™ i7-11800H @ 2.30GHz

NVIDIA-SMI 470.141.03   Driver Version: 470.141.03   CUDA Version: 11.4

OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce RTX 3050 Ti Laptop GPU/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 470.141.03
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 470.141.03
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

UPDATE 3:

adam@pc:~/git/bgfx_test/build$ ldd BgfxTest | grep GL
        libEGL.so.1 => /lib/x86_64-linux-gnu/libEGL.so.1 (0x00007f32b95dd000)
        libGLX.so.0 => /lib/x86_64-linux-gnu/libGLX.so.0 (0x00007f32b95a9000)
        libGLdispatch.so.0 => /lib/x86_64-linux-gnu/libGLdispatch.so.0 (0x00007f32b8d9d000)
  • Do you fully understand that EGL and GLX are both APIs for creating and managing OpenGL Context. Judging from the source you posted, it seems that you misunderstand that only GLX can create OpenGL Context. – Mark Miller Oct 25 '22 at 05:36
  • @MarkMiller In my understanding EGL Context != OpenGL Context, but a current EGL Context can be used as a context for OpenGL operations if `eglBindAPI` was called with `EGL_OPENGL_API`. Am I right? But knowing this doesn't explain why I cannot mix EGL and OpenGL contexts (even from different threads). – Broothy Oct 25 '22 at 05:52
  • EGLContext is an opaque type representing a client API context. And Client API is the term for OpenCL, OpenGL, OpenGL ES or OpenVG in the EGL specification. In other words, EGLContext is the wrapper for OpenGL Context. So, in your code, you are creating two OpenGL contexts; the one from EGL and the other from GLX. It should be helpful for you to check EGL specification, especially for chapter 1 and 2. The spec doc is relatively easy for programmers to read. – Mark Miller Oct 25 '22 at 06:06
  • Oops, in your code, you are trying creating three OpenGL contexts; the one from GLX and the others from EGL. – Mark Miller Oct 25 '22 at 06:14
  • @MarkMiller Yes, I know that at the end both of them act as an OpenGL context. But then why I can't make a context created by `glXCreateContext` active on one thread and then another context created by `eglCreateContext` active on another thread? Or why I can't override the context activated by `glXMakeContextCurrent` by a call of `eglMakeCurrent`? And why does `eglMakeCurrent` fail with error code: 0x3000 (EGL_SUCCESS)? – Broothy Oct 25 '22 at 06:18
  • @MarkMiller "Oops, in your code, you are trying creating three OpenGL contexts; the one from GLX and the others from EGL" - Mmmmm, only two as I count. One from GLX. Then I activate it. Then I create one from EGL. Then I try to activate it. Then: 1, If the second EGL context is activated from the main thread it fails with 0x3002 (EGL_BAD_ACCESS) 2, If the second EGL context is activated from a different thread it fails with 0x3000 (EGL_SUCCESS) – Broothy Oct 25 '22 at 06:25
  • You are right about the two contexts. I just overlooked that. There are a few suspicious things in your code, but you don't seem to be doing anything strange in terms of context handling. As for what you are asking, it should basically be possible in normal cases. By the way, for the sake of experimentation, will you change `EGL_STREAM_BIT_KHR` to `EGL_WINDOW_BIT` and report back to me what results you get? If this change results in a successful result, then the problem is more likely to be EGLStreams related, not EGL in general. – Mark Miller Oct 25 '22 at 07:28
  • If it is the EGLStream matter, you may need to review what you are referring to regarding EGLStream. For example, you may need to first create EGLSurface with `eglCreateStreamProducerSurfaceKHR()` and then call `eglMakeCurrent()` with the surface. – Mark Miller Oct 25 '22 at 09:06
  • @MarkMiller Negative, changing `EGL_SURFACE_TYPE, EGL_STREAM_BIT_KHR,` to `EGL_SURFACE_TYPE, EGL_WINDOW_BIT,` doesn't have any effect. I mean the error persists. – Broothy Oct 26 '22 at 05:17
  • Oh, is that so? Actually, both work fine in my environment. That's strange. Please tell me a little more about your environment. (distro, version, desktop environment, driver, etc.) – Mark Miller Oct 26 '22 at 06:20
  • After all, you might need a valid EGLSurface for `eglMakeCurrent()`. Will you change `EGL_SURFACE_TYPE` to `EGL_WINDOW_BIT`, and create the surface with `eglCreateWindowSurface()`, and then call `eglMakeCurrent(eglDisplay, surface, surface, globalEglContext)`? Or if you can, please try the same thing for the settings with `EGL_STREAM_BIT_KHR` and `eglCreateStreamProducerSurfaceKHR()`. This [link](https://github.com/aritger/eglstreams-kms-example/blob/master/egl.c) will be helpful for you. – Mark Miller Oct 26 '22 at 16:06
  • @MarkMiller please check my updated code. Even with a surface added nothing changes. But the EGL and the GLX calls on their own are fine, please check the `// Uncomment this and EGL context creation works` section of the example code: If I remove the active GLX created context BEFORE making the EGL created context active everything starts to work. So I think somehow the EGL and GLX inter-communication is broken (at least) in my setup. – Broothy Oct 28 '22 at 05:34
  • In the first place, I think that GLX and EGL should not be used together without any special need. In this case, you should use EGL only. Why don't you try to create the two OpenGL contexts by EGL? – Mark Miller Oct 29 '22 at 07:40
  • @MarkMiller Yes, but I have no choice. I explained why I need to mix them in the first part of the question: BGFX lib uses GLX on desktop Linux on the rendering thread, and the new feature I am about to introduce needs EGL Streams on the receiving threads. – Broothy Oct 30 '22 at 06:40
  • I understand your situation. One idea I just came up with is to stop sharing a single display connection between contexts. In your case, you should use a different display for EGL than the one used for GLX. And the display should be created in a different manner than GLX's display and should be the display works with EGLStream. For details, the previous link will be helpful. Anyway, creating a different display for EGL is worth a try, I suppose. – Mark Miller Oct 30 '22 at 10:34
  • @MarkMiller I am not sure if I understand your suggestion. There is one SDL window (I cannot create more) which has an X11 display (created by SDL under the hood). To get the EGL display I call `eglGetDisplay((EGLNativeDisplayType)display);` which would return the same EGLDisplay handle on multiple calls. Then in GLX: `glXCreateContext(display, ...)` needs the native X11 display which is given by the X11 window. I don't understand how I could have different display connections. – Broothy Nov 01 '22 at 18:32
  • EGL has other ways to get an EGLDisplay without using the native display created for the specific window. One is to call `eglGetDisplay(EGL_DEFAULT_DISPLAY)`, and another is to call `eglGetPlatformDisplay(...)`. It is the latter method that I suggested in one previous comment. And in that comment, I referred to this [link](https://github.com/aritger/eglstreams-kms-example/blob/master/egl.c) as a specific example of the method. If you have enough knowledge on EGLStream, you can choose the best method. If not, it would be better to read the docs on EGLStream and collect sample code first. – Mark Miller Nov 01 '22 at 23:42
  • In my opinion, EGLStream is not recommended as this API severely lacks in the documentation and working examples. Why do you need EGLStream in the first place? You should consider other buffer sharing methods. – Mark Miller Nov 02 '22 at 00:06
  • @MarkMiller I imported the `GetEglDevice` and `GetEglDisplay` functions from the example you have sent with minor mods (removed the DRM filtering). The same issue persisted for all device/display I got from the query, I got the 0x3002 (from the same thread) and 0x3000 (from a different thread) when I call `eglMakeCurrent`. I need EGLStreams to receive camera frames in GPU memory from a GStreamer pipeline with nvidia HW decoding. The GStreamer component is given and it uses EGLStream, I cannot change it, and I need to use it. – Broothy Nov 03 '22 at 17:02
  • Can you try the two EGLContext pattern (i.e. a pattern without GLX) that I suggested before? If that passes, we should be able to determine that the problem is, as you guessed, a conflict between GLX and EGL. And if the conflict is the cause, you have no choice but to unify with EGL, even if you have to abandon bgfx. – Mark Miller Nov 04 '22 at 00:17
  • @MarkMiller Sure, with two EGLContexts it works as expected. On the same thread the second `eglMakeCurrent` overrides the first one, and on a different thread I get two active contexts. I clearly see that I have no other choice than using only GLX or EGL, but I would like to know if it is a bug in my config (most probably driver bug) or it is the intended behavior? Should I open a bug ticket, or it is what it is? But in the latter case it should be documented that GLX and EGL cannot cooperate. – Broothy Nov 04 '22 at 06:39
  • Oh, I had forgotten glvnd. With glvnd, you can use GLX and EGL in the same process. To check this spec enabled in your binary, please execute `ldd EGLTest | grep GL` in the `build/` directory and add the output in your question. – Mark Miller Nov 05 '22 at 15:46
  • This [NVIDIA's blog](https://devblogs.nvidia.com/linking-opengl-server-side-rendering/) has a good explanation with this. With using cmake, you might need `GLX` for `COMPONENTS` when using FindOpenGL and add `OpenGL::OpenGL`, `OpenGL::EGL`, and `OpenGL::GLX` in target_link_libraries. – Mark Miller Nov 05 '22 at 16:02
  • @MarkMiller I missed explicitly linking GLX in CMakeLists.txt, but it got linked anyway, most probably via EGL (as you can see in the ldd output). I added GLX explicitly as recommended, but nothing has changed. As I read, this glvnd magic should happen automatically, under the hood. "The release 430 series will be the last to support installing Linux OpenGL and EGL client libraries that do not use the GL Vendor Neutral Dispatch (GLVND) loader library." - so my config should be fine. – Broothy Nov 07 '22 at 12:38
  • Judging from the ldd's result, glvnd seems to be enabled. That could be a driver's issue. You should ask about this issue on NVIDIA's related forums like [this](https://forums.developer.nvidia.com/c/gpu-graphics/145). BTW, bgfx has enough codes on EGL in its codebase, so I think that bgfx can support EGL for X11 environment. If you still want to use bgfx, you may as well post a feature request in bgfx's github issue. – Mark Miller Nov 07 '22 at 18:05
