The program is written in C using SDL2 and OpenGL. As long as I comment out the line
//glUseProgram(0);
the program compiles, runs, and displays the clear color. I included the following GL version checks:
const char* renderer = (const char*) glGetString(GL_RENDERER);
puts(renderer);
const char* version = (const char*) glGetString(GL_VERSION);
puts(version);
const char* glslVersion = (const char*) glGetString(GL_SHADING_LANGUAGE_VERSION);
puts(glslVersion);
These print out:
ATI Radeon HD 5670
3.2.11927 Core Profile Context
4.20
And the GL error check:
GLenum error = glGetError();
switch(error){
    case GL_NO_ERROR: puts("no error"); break;
    case GL_INVALID_ENUM: puts("invalid enum"); break;
    case GL_INVALID_VALUE: puts("invalid value"); break;
    case GL_OUT_OF_MEMORY: puts("out of memory"); break;
    case GL_INVALID_FRAMEBUFFER_OPERATION: puts("invalid framebuffer operation"); break;
    default: break;
}
which prints:
no error.
But when glUseProgram(0) is uncommented, I get the following linker error:
D:\TEMP\ccSTF4cr.o:App.c:(.text+0x320): undefined reference to 'glUseProgram'
I also get:
App.c:54.2: warning: implicit declaration of function 'glUseProgram'
The included files are:
#include "SDL2/SDL.h"
#include "SDL2/SDL_opengl.h"
The program is compiled and run from a .bat file on Windows XP. The .bat is:
del /F /Q bin\app.exe
set files=main.c App.c EventHub.c Game.c MainShader.c
set libs=-LD:\environments\minGW\mingw32\lib -lmingw32 -lopengl32 -lSDL2main -mwindows -lSDL2
set objs=bin\main.obj
gcc -Wall -o bin/app %files% %libs%
bin\app.exe
If you are not familiar with .bat files: %files% and %libs% in the gcc command are simply replaced with the strings assigned to those variables.
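So the gcc line effectively expands to this single command:
gcc -Wall -o bin/app main.c App.c EventHub.c Game.c MainShader.c -LD:\environments\minGW\mingw32\lib -lmingw32 -lopengl32 -lSDL2main -mwindows -lSDL2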
The obvious suspicion would be that the context does not support the newer glUseProgram function, except that the context is version 3.2, which does support it. So the issue seems to be that the SDL_opengl.h include together with -lopengl32 is not providing that function, or that the wrong opengl32 is being picked up. But frankly I don't really understand this linking business, which is why I am asking the question.
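From what I can tell, opengl32 on Windows only exports the OpenGL 1.1 entry points, and anything newer (glUseProgram included) is supposed to be loaded from the driver at runtime, e.g. through SDL_GL_GetProcAddress. A minimal sketch of what I think that would look like, assuming only the two includes above (the typedef and the my_glUseProgram name are just mine for illustration):

#include "SDL2/SDL.h"
#include "SDL2/SDL_opengl.h"

/* Assumption: opengl32 only exports GL 1.1, so glUseProgram has to be
   fetched from the driver at runtime via SDL_GL_GetProcAddress. */
typedef void (APIENTRY *PFN_glUseProgram)(GLuint program);
static PFN_glUseProgram my_glUseProgram = NULL;

/* Call this once, after SDL_GL_CreateContext() has succeeded. */
static int loadGLFunctions(void)
{
    my_glUseProgram = (PFN_glUseProgram) SDL_GL_GetProcAddress("glUseProgram");
    return my_glUseProgram != NULL; /* 1 if the entry point was found */
}

Is something like that the right fix, or is there a linker flag or import library that would let me keep calling glUseProgram directly?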