
I've been trying to use OpenGL 4, and the first obstacle was actually loading GL 4 instead of the disgusting software-only GL 1.1 that ships with MS Windows. I tried using GLEW and then updated my drivers a second time, and GL still reported the version as 1.1.0. It turns out the problem was not GLEW (which, it seems, wasn't even required here), nor was SDL breaking something internally (which I suspected, since that is what I use to create the context). Andon M. Coleman brought up the pixel format, which is something I had completely overlooked. I had been requesting 8 bits each for red, green, blue, and alpha, plus 32 bits for depth. I thought that was a nice even 64 bits: 32 for the color and 32 for the depth. However, since SDL assumes you want a stencil buffer too (which I now realize is commonly expected), the requested pixel format was actually 72 bits, and no accelerated format matched, since hardware pixel formats typically top out at 64 bits. With no accelerated format available, the context fell back to the ancient GL 1.1 support that Windows provides for use in the absence of drivers, which is also software-only.

The code has a lot of other stuff in it, so I have put together a basic sample of what I was trying to do. It is valid C++ and compiles on MinGW, assuming you link it properly.

#include <SDL2/SDL.h>
#include <SDL2/SDL_opengl.h> // pulls in <GL/gl.h> with the right platform headers
#include <stdio.h>
int main (int argc, char* argv[]) // SDL2 needs this signature when linking SDL2main
{
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,8);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,8);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,8);
    SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE,8);

    /// What I *was* doing...
    /* SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,32); */
    // And then I didn't set SDL_GL_STENCIL_SIZE at all.
    // Result: the total size of each pixel exceeds 64 bits.

    /// What I *am* doing...
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,24);
    SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE,8);
    // A nice even 32 bits for depth+stencil plus 32 bits for color is 64.

    SDL_Window* window = SDL_CreateWindow(
        "Hay GPU y is u no taek pixelz ovar 64 bitz?",
        SDL_WINDOWPOS_UNDEFINED,SDL_WINDOWPOS_UNDEFINED,
        1024,768,
        SDL_WINDOW_FULLSCREEN | SDL_WINDOW_OPENGL
    );
    SDL_GLContext context = SDL_GL_CreateContext(window);
    printf("GL Version [%s]\n",(const char*)glGetString(GL_VERSION));
    SDL_GL_DeleteContext(context); // delete the context before its window
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
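One addition I'd suggest right after SDL_GL_CreateContext: query the attributes back to see what format you were actually granted. This is only a sketch; SDL treats most of these attributes as minimums, so the real format can silently differ from what you asked for:

```cpp
// Sketch: after SDL_GL_CreateContext(), ask SDL what was actually granted.
int r, g, b, a, depth, stencil;
SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &r);
SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &g);
SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE, &b);
SDL_GL_GetAttribute(SDL_GL_ALPHA_SIZE, &a);
SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depth);
SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &stencil);
printf("Granted RGBA %d/%d/%d/%d, depth %d, stencil %d\n",
       r, g, b, a, depth, stencil);
```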

Hopefully other people who have been having a similar issue can learn from my mistake, or at least be able to mark it off a list of possible problems.

genpfault
rsethc
  • That is not even ***close*** to what is relevant. If you are getting an OpenGL 1.1 context, then the parameters you used when you set up your pixel format are **very** important (if you pick ones not supported by hardware, you get the reference GDI implementation), yet they are missing from your question. – Andon M. Coleman Jan 19 '14 at 19:11
  • So, what should I look for that may be causing this, then? – rsethc Jan 19 '14 at 19:37
  • The parameters you told SDL to use when you created your render context, as I mentioned in my comment. It would help a lot if you would include that information in the question rather than concluding that it is irrelevant and reducing the code to what you have now. – Andon M. Coleman Jan 19 '14 at 19:40
  • Ah, I really never thought about that since I actually have been creating my contexts through SDL. I'm going to try creating a context outside and then re-attaching it, and see if that works. – rsethc Jan 19 '14 at 21:56
  • I may have found the problem... this whole time I've been telling SDL to set up a 32-bit depth buffer. I just replaced the value with 16 to see what would happen, and the frame rate increased dramatically, so I'm guessing I have been rendering on the CPU only due to unsupported depth value size, and also the glGetString(GL_VERSION) now says "4.1.10834 Compatability Profile Context" rather than `1.1.0`. – rsethc Jan 19 '14 at 22:02
  • I think the "Compatability" part of that is simply there because my GPU is rather low-end, so I am not worried about that. What still bothers me, though only slightly, is that GLEW still says `1.10`. It isn't really worrying as much as it is just strange. – rsethc Jan 19 '14 at 22:03
  • Indeed. That is why I was trying to get you to list the parameters you passed to SDL :) This is a common mistake. 24-bit Depth (+ 8-bit Stencil) is often the highest that is supported without using FBOs. You can get a 32-bit (floating-point) depth buffer on your hardware, but you will need to use an FBO. – Andon M. Coleman Jan 19 '14 at 22:03
  • As for compatibility, no. That is because you have not created a **core** profile context. Having a compatibility profile means you can still use old parts of OpenGL that are deprecated. If you had a core profile, a lot of parts of GL would no longer work. – Andon M. Coleman Jan 19 '14 at 22:05
  • Can you update your question to show the SDL initialization code? It appears that the issue has been resolved, and it would help people in the future if you were to show the 32-bit depth buffer issue. – Andon M. Coleman Jan 19 '14 at 22:10
  • Yes, I will totally revise the question. Thank you, your advice has been extremely helpful! – rsethc Jan 19 '14 at 22:13

1 Answer


From SDL docs:

SDL_GL_CONTEXT_PROFILE_MASK determines the type of context created, while both SDL_GL_CONTEXT_MAJOR_VERSION and SDL_GL_CONTEXT_MINOR_VERSION determine which version. All three attributes must be set prior to creating the first window, and in general you can't change the value of SDL_GL_CONTEXT_PROFILE_MASK without first destroying all windows created with the previous setting.

Try setting the right values for SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_MAJOR_VERSION, and SDL_GL_CONTEXT_MINOR_VERSION; that will probably solve the issue (assuming your drivers actually support OpenGL 4.x contexts).
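For reference, a minimal sketch of what that looks like. The 4.1/core values here are assumptions for illustration; request whatever version your driver actually supports, and note that these calls must come before SDL_CreateWindow:

```cpp
// Sketch: request a specific context version and profile (before window creation).
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
```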

gabomdq
  • I have never had to adjust these, it seems that the defaults are just fine. The problem I was having earlier is already resolved anyway. – rsethc Jan 20 '14 at 02:16