
I am trying to use OpenGL and write the output directly to the framebuffer on an i.MX6 processor (CuBox). The reason for this is that I am trying to avoid using X11.

I used an example application from https://github.com/benosteen/opengles-book-samples/blob/master/Raspi/Chapter_2/Hello_Triangle/Hello_Triangle.c

I modified the code to not include any X11 header files or window-creation functions, and I replaced `surface = eglCreatePixmapSurface(display, config, (EGLNativePixmapType)hWnd, NULL);` with `surface = eglCreatePbufferSurface(display, config, surfaceAttribs);`

Then I copy the rendered pixels into the framebuffer using `glReadPixels()`.
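Since `glReadPixels()` returns rows bottom-to-top while the fbdev mapping runs top-to-bottom, the copy into `/dev/fb0` needs a vertical flip. A minimal sketch of that copy step (`blit_gl_to_fb` is a hypothetical helper; it assumes both buffers are tightly packed 32bpp with the same width):

```c
#include <stdint.h>
#include <string.h>

/* Copy a glReadPixels() result into a mapped 32bpp framebuffer.
 * GL row 0 is the bottom row of the image; fb row 0 is the top row,
 * so the rows are copied in reverse order. */
static void blit_gl_to_fb(const uint32_t *gl_pixels, uint32_t *fb,
                          int width, int height)
{
    for (int y = 0; y < height; ++y) {
        memcpy(&fb[y * width],
               &gl_pixels[(height - 1 - y) * width],
               (size_t)width * sizeof(uint32_t));
    }
}
```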

When I run the application using VisualGDB, it works well and the output is shown directly on the screen by writing to `/dev/fb0`.

However, when I run it from the console using `./OpenGL_Test_IMX`, I get the following errors:

```
The framebuffer device was opened successfully.
The framebuffer device was opened successfully.
1920x1200, 32bpp
The framebuffer device was mapped to memory successfully.
libEGL warning: DRI2: xcb_connect failed
libEGL warning: DRI2: xcb_connect failed
libEGL warning: GLX: XOpenDisplay failed
Segmentation fault
```
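For reference, the byte offset of a pixel in the mapped framebuffer depends on the driver's row stride (`line_length` from `fb_fix_screeninfo`), which may be larger than `xres * bpp / 8`. A small sketch (`fb_pixel_offset` is a hypothetical helper):

```c
#include <stddef.h>

/* Byte offset of pixel (x, y) in a mapped fbdev buffer.
 * line_length is the stride of one row in bytes, as reported by the
 * FBIOGET_FSCREENINFO ioctl; it can exceed xres * (bpp / 8) when the
 * driver pads rows. */
static size_t fb_pixel_offset(int x, int y, size_t line_length, int bpp)
{
    return (size_t)y * line_length + (size_t)x * (size_t)(bpp / 8);
}
```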

The code for my EGL initialization is shown below. I took out the error checking to make it shorter.

The code fails in `eglInitialize(display, &majorVersion, &minorVersion)`; the segmentation fault happens when the error path runs. (Note that `printf(eglGetError())` passes an EGLint where `printf()` expects a format string, which is itself a likely cause of the crash.)
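A safer way to report EGL errors is to translate the EGLint from `eglGetError()` to a name before printing. A sketch (`egl_error_name` is a hypothetical helper; the numeric values are from the EGL 1.4 specification):

```c
#include <stdio.h>

/* Map an eglGetError() code to a printable name.
 * Values are defined by the EGL 1.4 specification. */
static const char *egl_error_name(int code)
{
    switch (code) {
    case 0x3000: return "EGL_SUCCESS";
    case 0x3001: return "EGL_NOT_INITIALIZED";
    case 0x3002: return "EGL_BAD_ACCESS";
    case 0x3003: return "EGL_BAD_ALLOC";
    case 0x3004: return "EGL_BAD_ATTRIBUTE";
    case 0x3005: return "EGL_BAD_CONFIG";
    case 0x3006: return "EGL_BAD_CONTEXT";
    case 0x3007: return "EGL_BAD_CURRENT_SURFACE";
    case 0x3008: return "EGL_BAD_DISPLAY";
    case 0x3009: return "EGL_BAD_MATCH";
    case 0x300A: return "EGL_BAD_NATIVE_PIXMAP";
    case 0x300B: return "EGL_BAD_NATIVE_WINDOW";
    case 0x300C: return "EGL_BAD_PARAMETER";
    case 0x300D: return "EGL_BAD_SURFACE";
    case 0x300E: return "EGL_CONTEXT_LOST";
    default:     return "unknown EGL error";
    }
}
```

Usage would be `printf("eglInitialize failed: %s\n", egl_error_name(eglGetError()));`.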

```c
EGLBoolean CreateEGLContext(EGLNativeWindowType hWnd, EGLDisplay* eglDisplay,
                            EGLContext* eglContext, EGLSurface* eglSurface,
                            EGLint attribList[])
{
    EGLint numConfigs;
    EGLint majorVersion;
    EGLint minorVersion;
    EGLDisplay display;
    EGLContext context;
    EGLSurface surface;
    EGLConfig config;
    EGLint contextAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };

    // Get display
    display = eglGetDisplay(EGL_DEFAULT_DISPLAY);

    // Initialize EGL
    if (!eglInitialize(display, &majorVersion, &minorVersion))
    {
        // eglGetError() returns an EGLint, so it must not be passed to
        // printf() as the format string.
        printf("eglInitialize failed: 0x%x\n", eglGetError());
        return EGL_FALSE;
    }

    // Get configs
    eglGetConfigs(display, NULL, 0, &numConfigs);
    // Choose config
    eglChooseConfig(display, attribList, &config, 1, &numConfigs);

    EGLint surfaceAttribs[] = {
        EGL_WIDTH, 1900,
        EGL_HEIGHT, 1088,
        EGL_NONE
    };
    surface = eglCreatePbufferSurface(display, config, surfaceAttribs);
    //surface = eglCreatePixmapSurface(display, config, (EGLNativePixmapType)hWnd, NULL);

    // Create a GL context
    context = eglCreateContext(display, config, EGL_NO_CONTEXT, contextAttribs);

    // Make the context current
    eglMakeCurrent(display, surface, surface, context);

    *eglDisplay = display;
    *eglSurface = surface;
    *eglContext = context;
    return EGL_TRUE;
}
```

My last question is whether EGL is necessary to run OpenGL, or whether it is possible to run OpenGL and just have it write to a memory location.

If it's not possible to run without X11, how can I start just the minimum (e.g. a bare Xorg server) and have the application use that as its context? Can anyone provide some help (or a command) on how to start the program with an X11 context?

Maybe I am not starting the application correctly? For example, do I need to start an EGL server beforehand?
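The `xcb_connect`/`XOpenDisplay` warnings suggest this libEGL build is probing X11 backends, and the VisualGDB session may inherit a `DISPLAY` variable that the bare console lacks. One defensive step before calling `eglGetDisplay()` is to check for it explicitly (a sketch; `has_x_display` is a hypothetical helper):

```c
#include <stdlib.h>

/* Return 1 if an X display looks reachable via the environment.
 * X11-based EGL backends read DISPLAY; on a bare console it is
 * typically unset, matching the xcb_connect/XOpenDisplay failures. */
static int has_x_display(void)
{
    const char *d = getenv("DISPLAY");
    return d != NULL && d[0] != '\0';
}
```

If it returns 0, failing early with a clear message is friendlier than letting `eglInitialize()` crash deep inside the driver.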

Lastly, here is a pastebin of all my running processes when I start the application from VisualGDB (http://pastebin.com/t8hgWthx) and here is one from when I do not (http://pastebin.com/PvdpQqT8). I notice some differences, but nothing specific that looks necessary for launching the application with an X11 context.

genpfault
Mich
[Here](https://github.com/robclark/kmscube/blob/master/kmscube.c) is a working example with a GBM surface as backend. It uses libgbm and libdrm to initiate the required display for eglInitialize. – j-p Jan 03 '15 at 09:34

0 Answers