I have a Windows build environment using Cygwin and GCC, and am linking against the libraries for GLee, GLUT, and opengl32. This is a Win32 build.

All calls to glCreateShader are returning 0, yet I'm not picking up any errors. The code below is based on the Lighthouse3D tutorials for GLUT and GLSL, so the sequence of GL operations should be correct.

Here's the relevant code:

#define WIN32

#include <stdio.h>

#include <GL/GLee.h>
#include <GL/glut.h>

#include "SampleUtils.h"
#include "LineShaders.h"

GLint lineVertexHandle      = 0;
unsigned int lineShaderProgramID;

// forward declarations so main() compiles before these are defined
void renderScene();
void changeSize(int w, int h);
void init();

...

int main(int argc, char **argv) {

    // init GLUT and create window
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(100,100);
    glutInitWindowSize(320,320);
    glutCreateWindow("Lighthouse3D Tutorials");

    // register callbacks
    glutDisplayFunc(renderScene);
    glutReshapeFunc(changeSize);
    glutIdleFunc(renderScene);
        
    // initialize the shaders
    init();
    // enter GLUT event processing cycle
    glutMainLoop();
    return 0;
}

void init() {
    glClearColor( 0.0, 0.0, 0.0, 1.0 ); /* Set the clear color */

    lineShaderProgramID = SampleUtils::createProgramFromBuffer(lineMeshVertexShader,lineFragmentShader);
    
    lineVertexHandle = glGetAttribLocation(lineShaderProgramID,"vertexPosition");

}

SampleUtils is a utility class with the following methods for shader handling. The shaders lineMeshVertexShader and lineFragmentShader are defined in LineShaders.h.

unsigned int SampleUtils::createProgramFromBuffer(const char* vertexShaderBuffer, const char* fragmentShaderBuffer) {
    checkGlError("cPFB");

    // scroll down for initShader() - we never get past this point.
    GLuint vertexShader = initShader(GL_VERTEX_SHADER, vertexShaderBuffer);

    if (!vertexShader)
        return 0;    

    GLuint fragmentShader = initShader(GL_FRAGMENT_SHADER,
                                        fragmentShaderBuffer);
    if (!fragmentShader)
        return 0;

    GLuint program = glCreateProgram();
    if (program)
    {
        glAttachShader(program, vertexShader);
        checkGlError("glAttachShader");
        
        glAttachShader(program, fragmentShader);
        checkGlError("glAttachShader");
        
        glLinkProgram(program);
        GLint linkStatus = GL_FALSE;
        glGetProgramiv(program, GL_LINK_STATUS, &linkStatus);
        
        if (linkStatus != GL_TRUE)
        {
            GLint bufLength = 0;
            glGetProgramiv(program, GL_INFO_LOG_LENGTH, &bufLength);
            if (bufLength)
            {
                char* buf = (char*) malloc(bufLength);
                if (buf)
                {
                    glGetProgramInfoLog(program, bufLength, NULL, buf);
                    LOG("Could not link program: %s", buf);
                    free(buf);
                }
            }
            glDeleteProgram(program);
            program = 0;
        }
    }
    return program;
}

unsigned int
SampleUtils::initShader(unsigned int shaderType, const char* source)
{
    checkGlError("initShader");
    //GLuint shader = glCreateShader((GLenum)shaderType);

    /* trying explicit enum, just in case - shader is still always 0 */
    GLuint shader = glCreateShader(GL_VERTEX_SHADER);

    LOG("SHADER %i", shader);
    
    if (shader)
    {
        glShaderSource(shader, 1, &source, NULL);
        glCompileShader(shader);
        GLint compiled = 0;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
    
        if (!compiled)
        {
            GLint infoLen = 0;
            glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &infoLen);
            if (infoLen)
            {
                char* buf = (char*) malloc(infoLen);
                if (buf)
                {
                    glGetShaderInfoLog(shader, infoLen, NULL, buf);
                    LOG("Could not compile shader %d: %s", 
                        shaderType, buf);
                    free(buf);
                }
            }
            // moved out of the info-log check so a failed compile
            // never returns a live shader handle
            glDeleteShader(shader);
            shader = 0;
        }
    }
    return shader;

}

void SampleUtils::checkGlError(const char* operation) { 
    for (GLint error = glGetError(); error; error = glGetError())
        LOG("after %s() glError (0x%x)", operation, error);
}

I'm wondering if the context isn't fully initialized when glCreateShader is called, but I've tried calling init() from within the callbacks as well, with no effect. My searches on this issue have turned up the advice to build a known-good example to confirm the availability of glCreateShader. If anyone has one for C++, please advise.
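
If it helps, this is the sort of minimal check I have in mind. It's only a sketch assuming the same GLee/GLUT setup as above; GLEE_VERSION_2_0 is (as far as I can tell) GLee's runtime flag for GL 2.0 support:

#include <stdio.h>

#include <GL/GLee.h>
#include <GL/glut.h>

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutCreateWindow("GL capability check");   // a context must exist first

    printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));

    // glCreateShader can only succeed when this reports true
    printf("GL 2.0 supported: %s\n", GLEE_VERSION_2_0 ? "yes" : "NO");
    return 0;
}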


UPDATE:

Based on the feedback here I checked my OpenGL support using the glewinfo utility, and it reports that this system is limited to OpenGL 1.1. Full output: https://docs.google.com/document/d/1LauILzvvxgsT3G2KdRXDTOG7163jpEuwtyno_Y2Ck78/edit?hl=en_US

e.g.

---------------------------
    GLEW Extension Info
---------------------------

GLEW version 1.6.0
Reporting capabilities of pixelformat 2
Running on a GDI Generic from Microsoft Corporation
OpenGL version 1.1.0 is supported

GL_VERSION_1_1:                                                OK
---------------

GL_VERSION_1_2:                                                MISSING
---------------

etc.

What's strange is that with GLee I was able to compile code that calls these extension functions, though they apparently don't work at run time. I've checked my gl.h and glext.h headers and they are current; the declarations are all there. So how is this dealt with on Windows? How do you set up and link your environment so that you can develop with more than OpenGL 1.1 using Cygwin and Eclipse?
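
For background on why it compiles anyway: on Windows, opengl32.dll only exports the OpenGL 1.1 entry points, and everything newer has to be fetched from the driver at run time, which is what loaders like GLee and GLEW do internally. A sketch of that mechanism (the pglCreateShader name and loadGL2EntryPoints helper are mine):

#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>   // function-pointer typedefs such as PFNGLCREATESHADERPROC

PFNGLCREATESHADERPROC pglCreateShader = NULL;

void loadGL2EntryPoints(void) {
    // Requires a current GL context. Returns NULL when the active driver
    // (e.g. the GDI Generic 1.1 renderer) doesn't implement the function,
    // which is exactly the failure mode glewinfo reports above.
    pglCreateShader =
        (PFNGLCREATESHADERPROC)wglGetProcAddress("glCreateShader");
}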

  • I've never used GLee, but I know that because of its unique loading methodology, it may simply return 0 if your OpenGL version doesn't support the function. So do you get an OpenGL context version 2.0 or greater? – Nicol Bolas Aug 13 '11 at 20:19
  • No, glGetString(GL_VERSION) is giving me 1.1.0. But I've read that this is to be expected on Windows. I'll try GLEW and see if that helps. – olo Aug 14 '11 at 15:43
  • @olo I don't think GLEW will change that. This is only to be expected if your hardware only supports GL 1.1 or if you have a very old graphics driver (maybe the Windows default one?). Or maybe you are linking to the wrong "opengl32.dll" and not the one in your system directory?. – Christian Rau Aug 14 '11 at 16:12
  • My NVidia driver is version 8.15.11.8593, dated 2009. This is on a newer Win 7 64-bit system. I'm linking to opengl32, which should be the dll in my system32 directory. – olo Aug 14 '11 at 16:51
  • If you are getting version 1.1.0, then you are initializing OpenGL incorrectly. – Nicol Bolas Aug 14 '11 at 19:04
  • OK. I'm going to try to build a known-good example that uses these extensions to confirm that I can. Any recommendations for short demos or tutorials that use shaders? The lighthouse tutorials look like a good start. – olo Aug 14 '11 at 21:28
  • @olo So it looks like you are using the generic Windows driver and not your nVidia driver. Maybe Win7 just messed something up or your driver is not for Win7 or not installed correctly. Of course you were able to compile it with GLEE. The glext.h containing prototypes of those extension functions doesn't mean they are supported (I also have a new glext.h with 4.1 functions and a 2.1 GPU), as the extension functions need to be loaded at run-time (from the driver). What about trying the newest driver for a 64bit Win7? – Christian Rau Aug 15 '11 at 14:36
  • Yes, that makes sense. So I'm guessing that an NVidia OpenGL driver upgrade is in order (e.g. Download for Windows 7 and Vista (64-bit) @ http://developer.nvidia.com/opengl-driver ). But what opengl dll should I link to when this is installed? - is this still opengl32 or is there a new dll? – olo Aug 15 '11 at 14:54
  • @olo You of course link to the opengl32.dll from the system directory. As far as I know this is a generic DLL that just delegates everything to the specific driver implementation. – Christian Rau Aug 15 '11 at 15:16
  • The driver upgrade seems to have solved the problem. I'm showing a GL_VERSION of 2.1.2 now and the extensions that I need are available according to glewinfo - e.g. glCreateShader is returning non-0 values. So thanks everyone. – olo Aug 15 '11 at 15:54
  • BTW how do I close this Question? – olo Aug 15 '11 at 16:02

2 Answers


The solution to this question was provided in the comments; I'm highlighting it here in order to close the question.

All that was required was a driver upgrade to a version that supports the extensions that I'm using. So I installed NVidia's OpenGL driver, which can be obtained here - http://developer.nvidia.com/opengl-driver

It appears that my system's original NVidia driver had been replaced at some point, so the generic Windows OpenGL driver (the GDI Generic renderer) was being used instead, and it only supports OpenGL 1.1. I'd mistakenly thought that a GL_VERSION of 1.1.0 was normal on Windows, based on some bad advice I'd gotten. And the fact that I was able to compile and execute this code without errors led me to assume that the extensions were present. They were not.
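
In hindsight, a cheap guard in init() would have surfaced the problem immediately instead of letting glCreateShader() silently return 0. A sketch using GLee's GL 2.0 flag (assumes <stdio.h> and <stdlib.h> are included):

void init() {
    glClearColor(0.0, 0.0, 0.0, 1.0);

    // fail fast if the context fell back to a GL 1.1 driver
    if (!GLEE_VERSION_2_0) {
        fprintf(stderr, "OpenGL 2.0 unavailable (GL_VERSION: %s); "
                        "update the graphics driver.\n",
                (const char*)glGetString(GL_VERSION));
        exit(EXIT_FAILURE);
    }

    lineShaderProgramID = SampleUtils::createProgramFromBuffer(
        lineMeshVertexShader, lineFragmentShader);
    lineVertexHandle = glGetAttribLocation(lineShaderProgramID,
                                           "vertexPosition");
}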

olo
  • In case anyone else has the same problem as me: I was accidentally calling glCreateShader() during glBegin() and glEnd() in some debugging code. It returned 0 and gave no error. Only a subset of OpenGL functions can be called within glBegin() and glEnd(). See http://www.opengl.org/sdk/docs/man2/xhtml/glBegin.xml – AStupidNoob Feb 12 '14 at 05:15
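
To make the pitfall from that comment concrete, a minimal sketch (per the glBegin man page, glGetError itself also can't be called between glBegin and glEnd, so the failure is completely silent):

glBegin(GL_LINES);
    // glCreateShader is not in the subset of commands allowed between
    // glBegin/glEnd: it returns 0 and sets GL_INVALID_OPERATION, but the
    // error can't even be read until after glEnd()
    GLuint s = glCreateShader(GL_VERTEX_SHADER);   // returns 0 here
glEnd();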

I have had the same problem, but it was a subtle C++ initialization-order issue: my shader was compiled in a global / static variable (a wrapper class around the shader program), which was therefore initialized before any GL context existed. Hope it can help...
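
A minimal illustration of that trap (the names are mine):

// a global whose constructor touches GL runs before main(), i.e.
// before any context exists, so glCreateShader() returns 0 there
struct ShaderProgram {
    GLuint vs;
    ShaderProgram() { vs = glCreateShader(GL_VERTEX_SHADER); }  // too early
};

ShaderProgram g_program;          // constructed at static-init time: broken

// safer: defer construction until after the context is created
ShaderProgram* g_programPtr = 0;

void initAfterContext() { g_programPtr = new ShaderProgram(); }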

Kompilor