
I've tried everything to get OpenGL 3.2 to render with Cg shaders in my game engine, but I have had no luck. So I decided to make a bare minimal project, but the shaders still won't work. In theory my test project should render a red triangle, but it comes out white because the shader is not doing anything.

I'll post the code here:

#include <stdio.h>
#include <stdlib.h>
#include <vector>
#include <string>
#include <GL/glew.h>
#include <Cg/cg.h>
#include <Cg/cgGL.h>
#include <SDL2/SDL.h>

int main()
{
    SDL_Window *mainwindow;
    SDL_GLContext maincontext;

    SDL_Init(SDL_INIT_VIDEO);

    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

    mainwindow = SDL_CreateWindow("Test", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 512, 512, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);

    maincontext = SDL_GL_CreateContext(mainwindow);

    glewExperimental = GL_TRUE;
    glewInit();

    CGcontext cgcontext;
    cgcontext = cgCreateContext();
    cgGLRegisterStates(cgcontext);

    CGerror error;
    CGeffect effect;
    const char* string;
    std::string shader;

    shader =
            "struct VS_INPUT"
            "{"
            "   float3 pos              : ATTR0;"
            "};"

            "struct FS_INPUT"
            "{"
            "   float4 pos                  : POSITION;"
            "   float2 tex                  : TEXCOORD0;"
            "};"

            "struct FS_OUTPUT"
            "{"
            "   float4 color                : COLOR;"
            "};"

            "FS_INPUT VS( VS_INPUT In )"
            "{"
            "   FS_INPUT Out;"
            "   Out.pos = float4( In.pos, 1.0f );"
            "   Out.tex = float2( 0.0f, 0.0f );"
            "   return Out;"
            "}"

            "FS_OUTPUT FS( FS_INPUT In )"
            "{"
            "   FS_OUTPUT Out;"
            "   Out.color = float4(1.0f, 0.0f, 0.0f, 1.0f);"
            "   return Out;"
            "}"

            "technique t0"
            "{"
            "   pass p0"
            "   {"
            "      VertexProgram = compile gp4vp VS();"
            "      FragmentProgram = compile gp4fp FS();"
            "   }"
            "}";

    effect = cgCreateEffect(cgcontext, shader.c_str(), NULL);
    error = cgGetError();
    if(error)
    {
        string = cgGetLastListing(cgcontext);
        fprintf(stderr, "Shader compiler: %s\n", string);
    }

    glClearColor ( 0.0, 0.0, 1.0, 1.0 );
    glClear ( GL_COLOR_BUFFER_BIT );

    float* vert = new float[9];

    vert[0] = 0.0; vert[1] = 0.5; vert[2] =-1.0;
    vert[3] =-1.0; vert[4] =-0.5; vert[5] =-1.0;
    vert[6] = 1.0; vert[7] =-0.5; vert[8]= -1.0;

    unsigned int m_vaoID;
    unsigned int m_vboID;

    glGenVertexArrays(1, &m_vaoID);
    glBindVertexArray(m_vaoID);

    glGenBuffers(1, &m_vboID);

    glBindBuffer(GL_ARRAY_BUFFER, m_vboID);
    glBufferData(GL_ARRAY_BUFFER, 9 * sizeof(GLfloat), vert, GL_STATIC_DRAW);

    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(0);

    CGtechnique tech = cgGetFirstTechnique( effect );
    CGpass pass = cgGetFirstPass(tech);
    while (pass)
    {
        cgSetPassState(pass);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        cgResetPassState(pass);
        pass = cgGetNextPass(pass);
    }

    glDisableVertexAttribArray( 0 );

    glBindVertexArray(0);

    delete[] vert;

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glDeleteBuffers(1, &m_vboID);
    glDeleteVertexArrays(1, &m_vaoID);

    SDL_GL_SwapWindow(mainwindow);
    SDL_Delay(2000);

    SDL_GL_DeleteContext(maincontext);
    SDL_DestroyWindow(mainwindow);
    SDL_Quit();

    return 0;
}

What am I doing wrong?


2 Answers

5

I compiled the code and got the same result. So I added a Cg error handler to get a bit more information:

void errorHandler(CGcontext context, CGerror error, void * appdata) {
    fprintf(stderr, "%s\n", cgGetErrorString(error));
}
...
cgSetErrorHandler(&errorHandler, NULL);

When cgSetPassState and cgResetPassState were called I got the following error message:

Technique did not pass validation.
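
As a side note, the same failure can be caught up front by asking the Cg runtime to validate each technique before iterating its passes. A minimal sketch, reusing the effect variable from the code in the question:

// Validate every technique in the effect before using it; an invalid
// technique is exactly what triggers the error above.
for (CGtechnique t = cgGetFirstTechnique(effect); t; t = cgGetNextTechnique(t))
{
    if (!cgValidateTechnique(t))
        fprintf(stderr, "Technique '%s' failed validation\n", cgGetTechniqueName(t));
}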

That message is not really very informative, of course, so I used GLIntercept to trace all OpenGL calls to a log file.

This time, when glewInit was called I got the following error message in the log file:

glGetString(GL_EXTENSIONS)=NULL glGetError() = GL_INVALID_ENUM

According to the OpenGL documentation, glGetString must no longer be called with GL_EXTENSIONS; that usage was deprecated in 3.0, and glGetStringi must be used instead.
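
For comparison, here is a quick sketch of the core-profile way to enumerate extensions (the original code does not need to do this itself, since GLEW and Cg make these calls internally; the variable name is just illustrative):

// Core-profile replacement for glGetString(GL_EXTENSIONS): query the
// number of extensions, then fetch them one index at a time.
GLint numExtensions = 0;
glGetIntegerv(GL_NUM_EXTENSIONS, &numExtensions);
for (GLint i = 0; i < numExtensions; ++i)
    printf("%s\n", (const char *)glGetStringi(GL_EXTENSIONS, i));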

Finally, I found the issue in the GLEW library: http://sourceforge.net/p/glew/bugs/120/

I removed the GLEW dependency and tested with gl3.h (and the more recent glcorearb.h). I got the same error, but this time it occurred when cgGLRegisterStates was called.

I also tried the Cg trace.dll, only to get the same error (7939 = 0x1F03 = GL_EXTENSIONS):

glGetString
  {
  input:
    name = 7939
  output:
    return = NULL
  }

Then, I tested OpenGL 3.1 (SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);), and found that it was working fine:

glGetString(GL_EXTENSIONS)="GL_AMD_multi_draw_indirec..."

That is, the 3.1 context was compatible with previous OpenGL versions, but the 3.2 context was not.

After a bit of Internet digging I found that you can create this kind of compatibility OpenGL context with SDL just by adding this line to the code:

SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_COMPATIBILITY);
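
Note that the attribute has to be set before the window and context are created; a sketch of how the setup from the question would look with it (same variable names as above):

// Request a 3.2 context with the compatibility profile; all attributes
// must be set before SDL_CreateWindow / SDL_GL_CreateContext.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_COMPATIBILITY);

mainwindow = SDL_CreateWindow("Test", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                              512, 512, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
maincontext = SDL_GL_CreateContext(mainwindow);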

IMHO, the Cg Toolkit needs this kind of compatibility profile.

  • I don't want to create a compatibility profile. I already knew I could do that. I tried it with the GL3 header and it still does not work. I don't think the problem is with GL_EXTENSIONS but it sounds like you may be on to something with the "Technique did not pass validation." I need a solution that uses a pure core profile. – SteveDeFacto Dec 29 '12 at 02:22
  • @SteveDeFacto To me, the "Technique did not pass validation" error appears because the GLEW library fails and then `glCreateProgram`, `glCreateShader`, ... are never called. You can test this with the `cgIsTechniqueValidated` function. To use a core profile I would need to download the GLEW source code, apply a patch, and test again. By the way, have you tried GLIntercept or something similar? – Juan Mellado Dec 29 '12 at 03:02
  • Try this instead of GLEW: http://www.opengl.org/registry/api/gl3.h It was made for the OpenGL core profile, and it still does not work. – SteveDeFacto Dec 29 '12 at 03:07
  • @SteveDeFacto I have tested gl3.h (and the more recent glcorearb.h) and got the same error, but this time from Cg, not from GLEW. IMHO, the Cg Toolkit needs this kind of compatibility profile to work. I have updated the answer. – Juan Mellado Dec 29 '12 at 17:18
  • Where did you read that CG needs a compatibility profile? CG has the ability to compile up to shader version 5. Why would it need a compatibility profile? – SteveDeFacto Dec 30 '12 at 02:51
  • @SteveDeFacto I said "IMHO, [I think that] ...", not "I have read that..., here is the link". Anyway, I have not found any reference to `glGetStringi` inside cgGL.dll, and all the official Cg examples seem to use a compatibility profile via GLUT. Take a look at slide 97 of this [NVIDIA talk](http://www.slideshare.net/Mark_Kilgard/gtc-2010-opengl) – Juan Mellado Dec 30 '12 at 11:13
  • @SteveDeFacto: "Why would it need a compatibility profile?" Because NVIDIA *hates* the core profile. They have been outspoken against the idea of removing things from OpenGL, and they tell people in their SDK to use the compatibility profile. Such a company is highly unlikely to make any effort to make their external Cg compiler core-profile safe. – Nicol Bolas Dec 31 '12 at 04:41
  • I find it hard to believe that there is nothing about CG requiring a compatibility profile anywhere on the internet. One would think at least a few people would have complained about the problem I'm having. This sounds more like a wild guess based on little if any evidence. If anyone can find an official statement or at least a few forum posts complaining about this issue, I will consider the compatibility profile to be the solution. Until then I'm holding out hope that Juan is wrong. – SteveDeFacto Dec 31 '12 at 05:30
  • I've been trying to find a compiler option for cgc to make it compile shaders that work with the OpenGL 3.2 core profile, but I have had no luck. At this point, even if NVIDIA Cg can work with a core profile, it may as well not, because there is not enough documentation for anyone to know how. I'm just going to award this bounty to Juan since he tried to find an answer. – SteveDeFacto Jan 01 '13 at 08:48
0

"Cg 3.1 context does not yet support forward-compatible OpenGL contexts!"

Source: http://3dgep.com/introduction-to-shader-programming-with-cg-3-1/

And since the Cg project seems to be abandoned, support for them is not likely to be added.
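
For reference, SDL2 only creates a forward-compatible context if you explicitly request the corresponding flag, so leaving it unset (as the code in the question does) is the safe choice with Cg. A sketch of the attribute in question:

// Make sure SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG is not requested when
// the Cg runtime is involved; passing 0 clears any context flags.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, 0);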

darklon