
This one is a bit weird, but I will start at the beginning:

As far as I can gather, there are three ways to open an OpenGL window in Haskell: GLUT, GLFW and SDL. I don't want to use GLUT at all, because it forces you to use IORefs and to work basically only in the IO monad. So I tried GLFW and made a little test program on my laptop, which runs Xubuntu with the XFCE desktop.

Happy with that, I copied it over to my desktop, a fairly freshly installed standard Ubuntu with Unity, and was amazed to see nothing. The very same GLFW code that worked fine on the laptop got caught in an endless loop before it could open the window.

So I ported it all over to SDL. Same code, same window, and SDL crashes with:

Main.hs: user error (SDL_SetVideoMode
SDL message: Couldn't find matching GLX visual)

I have cross-checked with SDLgears, which uses the same method to open a window, and it works fine. The same goes for some other 3D applications, so OpenGL itself is working.

What baffles me is that it works under Xubuntu but not under Ubuntu. Am I missing something here? Oh, and in case it helps, here is the window-opening function:

runGame w h (Game g) = withInit [InitVideo] $ do
    glSetAttribute glRedSize 8
    glSetAttribute glGreenSize 8
    glSetAttribute glBlueSize 8
    glSetAttribute glAlphaSize 8
    glSetAttribute glDepthSize 16
    glSetAttribute glDoubleBuffer 1

    _ <- setVideoMode w h 32 [OpenGL, Resizable]

    matrixMode $= Projection
    loadIdentity
    perspective 45 (fromIntegral w / fromIntegral h) 0.1 10500.0
    matrixMode $= Modelview 0
    loadIdentity

    shadeModel $= Smooth
    hint PerspectiveCorrection $= Nicest

    depthFunc $= Just Lequal
    clearDepth $= 1.0

    g
Lanbo

1 Answer


This error message is telling you that the combination of bit depths you requested for the color, depth and alpha buffers (a "GLX visual") is not supported. To see which visuals are available on your system, try running glxinfo.

$ glxinfo
...

65 GLX Visuals
    visual  x   bf lv rg d st  colorbuffer  sr ax dp st accumbuffer  ms  cav
  id dep cl sp  sz l  ci b ro  r  g  b  a F gb bf th cl  r  g  b  a ns b eat
----------------------------------------------------------------------------
0x023 24 tc  0  32  0 r  y .   8  8  8  8 .  .  0 24  8 16 16 16 16  0 0 None
0x024 24 tc  0  32  0 r  . .   8  8  8  8 .  .  0 24  8 16 16 16 16  0 0 None
0x025 24 tc  0  32  0 r  y .   8  8  8  8 .  .  0 24  0 16 16 16 16  0 0 None
0x026 24 tc  0  32  0 r  . .   8  8  8  8 .  .  0 24  0 16 16 16 16  0 0 None
0x027 24 tc  0  32  0 r  y .   8  8  8  8 .  .  0 24  8  0  0  0  0  0 0 None
...
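
To tie that back to the code in the question: the visuals shown above all offer 8 bits per colour channel with a 24-bit (or absent) depth buffer, while the question passes 32 as the bpp argument to setVideoMode. Below is a minimal sketch of a request aligned with those visuals, assuming the same Graphics.UI.SDL binding used in the question; openAlignedWindow is just an illustrative name, and which attribute ends up mattering depends on your driver.

import Graphics.UI.SDL

-- Sketch only: ask for attributes matching the 0x023 visual above
-- (8/8/8/8 colour, 24-bit depth buffer) and pass 0 as the bpp argument
-- so SDL keeps the current desktop pixel depth.
openAlignedWindow :: Int -> Int -> IO ()
openAlignedWindow w h = withInit [InitVideo] $ do
    glSetAttribute glRedSize 8
    glSetAttribute glGreenSize 8
    glSetAttribute glBlueSize 8
    glSetAttribute glAlphaSize 8
    glSetAttribute glDepthSize 24      -- the listed visuals have 24-bit depth buffers
    glSetAttribute glDoubleBuffer 1
    _ <- setVideoMode w h 0 [OpenGL, Resizable]   -- 0 = use the current display depth
    return ()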
hammar
  • The attributes passed to glXChooseFBConfig select minimum required features, so selecting a 16-bit depth buffer may return a 24-bit depth buffer. You get no result only if no mode matches the minimum requirements. Or the Haskell GLFW binding selects for exactly the requested config, which can happen, too. – datenwolf Sep 30 '11 at 20:27
  • I must admit that I cannot quite read that table, but I have tried fiddling with the settings in `glSetAttribute`, to no avail. – Lanbo Sep 30 '11 at 21:45
  • You probably want to fiddle with the settings on `setVideoMode` in addition to the settings on `glSetAttribute` -- if, like hammar, you only have a 24-bit display, then asking for a 32-bit display is bound to fail. – Daniel Wagner Sep 30 '11 at 21:51
  • I have set the `bpp` parameter of `setVideoMode` to 0, which means the current desktop settings, and removed all the `glSetAttribute` calls, but again to no avail. – Lanbo Sep 30 '11 at 22:07
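
Following up on the comment thread: since setVideoMode reports failure as an IOError (the user error (SDL_SetVideoMode ...) shown in the question), one way to narrow things down is to walk through progressively less demanding depth-buffer requests and see which one, if any, the driver accepts. This is only a diagnostic sketch against the same Graphics.UI.SDL binding; tryDepths is an illustrative name, not part of the library.

import Control.Exception (SomeException, try)
import Graphics.UI.SDL

-- Diagnostic sketch: try a series of depth-buffer sizes and report the
-- first one for which setVideoMode succeeds. setVideoMode signals failure
-- by throwing an IOError, which `try` catches here.
tryDepths :: Int -> Int -> [Int] -> IO ()
tryDepths _ _ [] = putStrLn "No depth-buffer size was accepted."
tryDepths w h (d:ds) = do
    glSetAttribute glDepthSize d
    r <- try (setVideoMode w h 0 [OpenGL] >> return ()) :: IO (Either SomeException ())
    case r of
        Right _ -> putStrLn ("setVideoMode succeeded with a " ++ show d ++ "-bit depth request.")
        Left _  -> tryDepths w h ds

main :: IO ()
main = withInit [InitVideo] $ tryDepths 800 600 [24, 16, 0]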