
Specifying the GLSL version gives a syntax error when using LWJGL. I haven't tried to reproduce this issue outside LWJGL. This is happening on multiple Macs running Lion.

I've gotten both vertex and fragment shaders to work without using #version. But I'm about to use the texture function, which seems to require a #version directive.

Here's the simplest failing example:

#version 120

void main() {
  gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

Compiling this fragment shader and calling glGetShaderInfoLog gives this error:

ERROR: 0:1: '' : syntax error #version
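For reference, here's roughly how I compile the shader and fetch that log (LWJGL 2; a trimmed sketch, so the method name is just illustrative):

import org.lwjgl.opengl.GL20;

// Compiles a fragment shader and prints the driver's info log, if any.
// Assumes a current OpenGL context (e.g. created via Display.create()).
static int compileFragmentShader(CharSequence source) {
    int shader = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
    GL20.glShaderSource(shader, source);
    GL20.glCompileShader(shader);
    // 8192 is just a generous upper bound on the log length.
    String log = GL20.glGetShaderInfoLog(shader, 8192);
    if (log.length() > 0) {
        System.err.println(log);
    }
    return shader;
}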

Replacing 120 with anything else, such as 110, also gives an error. Curiously, though, if I use 130 or higher, I get the same syntax error plus a complaint that the version isn't supported. (I know my system doesn't have GLSL 1.3, but it's still weird that this error shows up when the compiler is acting like it doesn't understand the version tag at all.)

I'm on a Mac with an ATI Radeon HD 4670. GL_VERSION is 2.1 ATI-7.12.9 and GL_SHADING_LANGUAGE_VERSION is 1.20.

Given that, I don't see any reason why GLSL 1.20 should be unavailable. And it's really weird to me that it's saying #version is a syntax error, as opposed to saying something about an unsupported GLSL version.
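(Those strings come straight from glGetString; through LWJGL that query is just the snippet below, run with a current context.)

import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL20;

// Prints "2.1 ATI-7.12.9" and "1.20" on my machine.
System.out.println(GL11.glGetString(GL11.GL_VERSION));
System.out.println(GL11.glGetString(GL20.GL_SHADING_LANGUAGE_VERSION));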

– rlkw1024

2 Answers


Solved! It had nothing to do with OpenGL. My file reader code was stripping all the line breaks. That was harmless in the body of the shader, where every statement ends in a semicolon, but a preprocessor directive is terminated by the newline itself, so with the breaks gone, everything after #version 120 was swallowed into the directive.

So if you hit this problem, make sure the string you're actually passing to glShaderSource still has its line breaks.
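Here's a minimal sketch of a reader that keeps them (plain Java IO; the method name is mine, not anything LWJGL-specific):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

// Reads a shader file into one string, re-appending the newline that
// BufferedReader.readLine() strips. Without it the whole file collapses
// onto one line, and everything after "#version 120" gets swallowed
// into the directive, hence the syntax error at 0:1.
static String readShaderSource(String path) throws IOException {
    StringBuilder source = new StringBuilder();
    BufferedReader reader = new BufferedReader(new FileReader(path));
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            source.append(line).append('\n');
        }
    } finally {
        reader.close();
    }
    return source.toString();
}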

– rlkw1024
  • Thanks a lot, you've saved my night! I'm on a Mac with SDL and C++, and with this obscure error message OpenGL was trying to tell me the same thing. – zubko Jan 03 '12 at 23:43
  • A way to approach this may be found [here](http://schabby.de/opengl-shader-example/). The BufferedReader pulls the shader source in line by line, discarding CR/LF characters. You then explicitly reappend a newline character to each line using the StringBuilder (see about 2/3 of the way down that page). – Engineer Nov 12 '12 at 10:40
  • Thanks a lot as well. This was the exact issue I was having. Pesky Java IO removing line breaks! – lcmylin Jul 30 '15 at 20:37

Both the vertex and the fragment shader need to have the same version, so if you add #version 120 to the fragment shader, you have to add it to the vertex shader as well. It's a bit strange that this gets reported as a syntax error, though; maybe there's another error involved, but both shaders definitely have to carry the same version tag.

EDIT: Also keep in mind that the version directive needs to be the first line of the shader source (leading newlines and comments should be OK by the specification, but who knows what the drivers think).

– Christian Rau
  • I tried using the same version in both, but no luck. Since this happens in compilation and not linking, I don't think the shaders know about each other anyway. – rlkw1024 Dec 16 '11 at 23:48
  • @Jarrett Considering the ominous vendor of your graphics card, it may also be a driver bug. Though this is really a very simple feature and should be no problem to support, but then again, it's still ATI. – Christian Rau Dec 16 '11 at 23:54
  • I just verified the same issue with a GeForce 320M. So I'm starting to doubt that both vendors have the exact same bug in such a basic feature. – rlkw1024 Dec 16 '11 at 23:59