
I have the following piece of shader code that works perfectly with GLSL #version 130, but I would like to convert it to code that works with #version 330 (somehow the #130 version doesn't work on my Ubuntu machine with a GeForce 210; the shader does nothing). After several failed attempts (I keep getting link errors with no description) I've decided to ask for help. The code below dynamically changes the contrast and brightness of a texture using the uniform variables Brightness and Contrast. I have implemented it in Python using PyOpenGL:

from OpenGL.GL import *
from OpenGL.GL import shaders

def createShader():
    """
    Compile a shader that adjusts contrast and brightness of the active texture.

    Returns:
        shader - handle of the compiled shader program
        dict - locations of the uniform variables that can be passed to the shader
    """
    fragmentShader = shaders.compileShader("""#version 130
    uniform sampler2D Texture;
    uniform float Brightness;
    uniform float Contrast;
    uniform vec4 AverageLuminance;

    void main(void)
    {
        vec4 texColour = texture2D(Texture, gl_TexCoord[0].st);
        gl_FragColor = mix(texColour * Brightness,
                           mix(AverageLuminance, texColour, Contrast), 0.5);
    }
    """, GL_FRAGMENT_SHADER)
    shader = shaders.compileProgram(fragmentShader)

    uniform_locations = {
        'Brightness': glGetUniformLocation(shader, 'Brightness'),
        'Contrast': glGetUniformLocation(shader, 'Contrast'),
        'AverageLuminance': glGetUniformLocation(shader, 'AverageLuminance'),
        'Texture': glGetUniformLocation(shader, 'Texture')
    }
    return shader, uniform_locations
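
For context, this is roughly how I use the returned handle and uniform locations each frame (a sketch; `drawTexturedQuad()` stands in for my actual drawing code):

shader, uniforms = createShader()

glUseProgram(shader)
glUniform1i(uniforms['Texture'], 0)               # sampler reads from texture unit 0
glUniform1f(uniforms['Brightness'], 1.2)
glUniform1f(uniforms['Contrast'], 0.8)
glUniform4f(uniforms['AverageLuminance'], 0.5, 0.5, 0.5, 1.0)
drawTexturedQuad()                                # placeholder for the actual geometry submission
glUseProgram(0)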

I've looked up the changes that need to be made for the new GLSL version and tried changing the fragment shader code to the following, but then I only get non-descriptive link errors:

fragmentShader = shaders.compileShader("""#version 330
uniform sampler2D Texture;
uniform float Brightness;
uniform float Contrast;
uniform vec4 AverageLuminance;
in vec2 TexCoord;
out vec4 FragColor;

void main(void)
{
    vec4 texColour = texture(Texture, TexCoord);
    FragColor = mix(texColour * Brightness, 
                mix(AverageLuminance, texColour, Contrast), 0.5);
}
""", GL_FRAGMENT_SHADER)

Is there anyone that can help me with this conversion?

  • What about the corresponding vertex shader? – yiding Jan 15 '13 at 11:24
  • 1
    I doubt that raising the shader version profile will solve any issue. `#version 330` is OpenGL-3.3 and according to the NVidia product website the maximum OpenGL version supported by the GeForce 210 is OpenGL-3.1, i.e. `#version 140` – datenwolf Jan 15 '13 at 11:31
  • @yiding: I created no vertex shader because I didn't think I'd need one (I wouldn't know what I should make it do). It worked before without any vertex shader as well. – Daniel Schreij Jan 15 '13 at 12:46
  • @datenwolf: the glGetString(GL_VERSION) on the NVidia machine reads out OpenGL version 3.3.0. This is Ubuntu, so it might be possible that it differs from the Windows specifications? – Daniel Schreij Jan 15 '13 at 12:46
  • 1
    @DanielS: What drivers do you have installed? The NVidia proprietary or the MesaGL ones? `glGetString(GL_RENDERER);` Also with OpenGL-3 and beyond (in core profile) you must supply at least a vertex and a fragment shader. – datenwolf Jan 15 '13 at 13:30

1 Answer


> I doubt that raising the shader version profile will solve any issue. `#version 330` is OpenGL-3.3 and according to the NVidia product website the maximum OpenGL version supported by the GeForce 210 is OpenGL-3.1, i.e. `#version 140`

> I created no vertex shader because I didn't think I'd need one (I wouldn't know what I should make it do). It worked before without any vertex shader as well.

Probably only as long as you didn't use a fragment shader, or before you were attempting to use a texture. The fragment shader needs input variables, coming from a vertex shader, to have something it can use as texture coordinates. `TexCoord` is not a built-in variable (and in higher GLSL versions the built-in variables suitable for the job have been removed), so you need to declare it as a vertex shader output and fill it with a meaningful value there.
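
To illustrate, a minimal `#version 330` core pair could look like the following sketch. The attribute names `position` and `texcoord` are placeholders that you have to wire up yourself on the Python side (`glGetAttribLocation`, `glVertexAttribPointer`), and the pass-through transform assumes your quad is already in clip space:

vertexShader = shaders.compileShader("""#version 330
in vec3 position;   // generic vertex attribute, set up via glVertexAttribPointer
in vec2 texcoord;   // generic vertex attribute carrying the texture coordinate
out vec2 TexCoord;  // handed on to the fragment shader

void main()
{
    TexCoord = texcoord;
    gl_Position = vec4(position, 1.0);  // supply your own matrix here if needed
}
""", GL_VERTEX_SHADER)

fragmentShader = shaders.compileShader("""#version 330
uniform sampler2D Texture;
uniform float Brightness;
uniform float Contrast;
uniform vec4 AverageLuminance;
in vec2 TexCoord;   // matches the vertex shader output by name
out vec4 FragColor;

void main(void)
{
    vec4 texColour = texture(Texture, TexCoord);  // texture() replaces texture2D() in core
    FragColor = mix(texColour * Brightness,
                    mix(AverageLuminance, texColour, Contrast), 0.5);
}
""", GL_FRAGMENT_SHADER)

shader = shaders.compileProgram(vertexShader, fragmentShader)

Note that in the core profile the overloaded `texture()` function replaces `texture2D()` as well.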

> the glGetString(GL_VERSION) on the NVidia machine reads out OpenGL version 3.3.0. This is Ubuntu, so it might be possible that it differs from the Windows specifications?

Do you have the NVidia proprietary drivers installed? And are they actually used? Check with `glxinfo` or `glGetString(GL_RENDERER)`. OpenGL-3.3 is not too far from OpenGL-3.1, and in theory OpenGL major versions map to hardware capabilities.
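
For example, with PyOpenGL and a current context, a minimal check could look like this:

from OpenGL.GL import glGetString, GL_RENDERER, GL_VERSION

# Requires a current OpenGL context, i.e. a window must already be open
print(glGetString(GL_RENDERER))  # the open source drivers mention "Mesa" or "Gallium" here
print(glGetString(GL_VERSION))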

datenwolf
  • First of all, thanks for your help! I use the proprietary drivers at the moment, as the MesaGL ones one day just stopped working and I couldn't get this fixed. I'm pretty new to OpenGL and specifically shaders (I learned OpenGL ages ago before shaders became a big thing). I followed the tutorial at http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml to create this piece of code, but frankly I only have a gist of what I'm really doing. All tutorials on GLSL and Python I can find are outdated with respect to the modern GLSL versions. – Daniel Schreij Jan 15 '13 at 13:58
  • Would `VERTEX_SHADER = shaders.compileShader("""#version 330 void main() { gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; }""", GL_VERTEX_SHADER)` be sufficient as a basic vertex shader that does nothing but pass on its input? – Daniel Schreij Jan 15 '13 at 14:03
  • @DanielS: If you're using a compatibility profile context, then yes. Note that you must explicitly request a compatibility version profile, as the default is core, i.e. write `#version 330 compatibility`. `gl_ModelViewProjectionMatrix` is a compatibility profile built-in variable. I strongly recommend not using the built-in matrix stack, though (see the sketch after these comments). – datenwolf Jan 15 '13 at 14:46
  • Ah I see, these variables also do not exist anymore by default. I'd like my shader program to be native and not run in compatibility mode. I think I'm just going to look for more sources on the newer GLSL versions and see if I can start from scratch again. Thanks for all the input! – Daniel Schreij Jan 15 '13 at 22:20
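
For completeness, a compatibility-profile vertex shader along the lines discussed in the comments above might look like this sketch; it bridges the deprecated built-in inputs to the `TexCoord` input of the `#version 330` fragment shader and uses the built-in matrix stack that datenwolf advises against:

VERTEX_SHADER = shaders.compileShader("""#version 330 compatibility
out vec2 TexCoord;  // feeds the fragment shader's TexCoord input

void main()
{
    TexCoord = gl_MultiTexCoord0.st;                         // deprecated built-in attribute
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;  // deprecated built-in matrix stack
}
""", GL_VERTEX_SHADER)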