
I am currently working on an OpenGL Shader class that uses SPIR-V to compile the shaders. Getting the compiled binaries from SPIR-V works fine; the problem is that my program crashes with the OpenGL error `GL_INVALID_OPERATION error generated. <program> has not been linked, or is not a program object.` when I try to bind the shader with `glUseProgram(m_RendererID)`.
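For reference, here is a minimal sketch of the bind site with an added `GL_LINK_STATUS` check (the `Bind()` wrapper is only illustrative; `m_RendererID` is the member used in the function below). It only confirms that the error comes from a program object that never finished linking:

void OpenGLShader::Bind() const
{
    // Sketch only: verify the link status before binding, to confirm that the
    // GL_INVALID_OPERATION is caused by an unlinked program object.
    GLint linked = GL_FALSE;
    glGetProgramiv(m_RendererID, GL_LINK_STATUS, &linked);
    if (linked == GL_FALSE)
    {
        std::cout << "Trying to bind a shader program that is not linked" << std::endl;
        return;
    }

    glUseProgram(m_RendererID);
}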

This is my shader creation function:

void OpenGLShader::LoadAndCreateShaders(const std::unordered_map<GLenum, std::vector<uint32>> &shaderData)
{
    if (m_RendererID)
        glDeleteProgram(m_RendererID);

    GLuint program = glCreateProgram();
    m_RendererID = program;

    std::vector<GLuint> shaderRendererIds;
    shaderRendererIds.reserve(shaderData.size());

    for (auto &[stage, data] : shaderData)
    {
        GLuint shaderId = glCreateShader(stage);
        glShaderBinary(1, &shaderId, GL_SHADER_BINARY_FORMAT_SPIR_V, data.data(), (uint32)data.size());
        glSpecializeShader(shaderId, "main", 0, nullptr, nullptr);
        glAttachShader(program, shaderId);

        GLint status;
        glGetShaderiv(shaderId, GL_COMPILE_STATUS, &status);
        if (status == GL_FALSE)
        {
            std::cout << "Shader compilation failed" << std::endl;
        }

        shaderRendererIds.emplace_back(shaderId);
    }

    // Link shader program
    glLinkProgram(program);

    int32 isLinked = 0;
    glGetProgramiv(program, GL_LINK_STATUS, (int32*)&isLinked);
    if (isLinked == GL_FALSE)
    {
        int32 maxLength = 0;
        glGetProgramiv(program, GL_INFO_LOG_LENGTH, &maxLength);

        if (maxLength > 0)
        {
            std::vector<GLchar> infoLog(maxLength);
            glGetProgramInfoLog(program, maxLength, &maxLength, &infoLog[0]);
            std::cout << "Shader linking failed: " << &infoLog[0] << std::endl;

            glDeleteProgram(program);
            for (auto id : shaderRendererIds)
                glDeleteShader(id);
        }
    }

    for (auto id : shaderRendererIds)
        glDetachShader(program, id);
}

This function takes in the compiled SPIR-V binaries and tries to create the OpenGL shader program from them. When I debugged the program, I found that after `glLinkProgram()` `isLinked` stays 0, but the `maxLength` variable also stays 0. So the linking process failed, but did not produce an info log. Has anyone had the same issue before?
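As a side note, here is a minimal sketch of how the per-stage check could also dump the shader info log after `glSpecializeShader()` (this logging is not part of the function above; `shaderId` is the same variable as in the loop):

// For SPIR-V shaders, GL_COMPILE_STATUS reflects whether specialization succeeded
GLint status = GL_FALSE;
glGetShaderiv(shaderId, GL_COMPILE_STATUS, &status);
if (status == GL_FALSE)
{
    GLint logLength = 0;
    glGetShaderiv(shaderId, GL_INFO_LOG_LENGTH, &logLength);
    if (logLength > 0)
    {
        std::vector<GLchar> infoLog(logLength);
        glGetShaderInfoLog(shaderId, logLength, &logLength, infoLog.data());
        std::cout << "Shader specialization failed: " << infoLog.data() << std::endl;
    }
}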

  • Are you sure ```data.size() * sizeof(uint32)``` is the correct size to pass? Why are you multiplying it by the size of an integer? – armagedescu Apr 18 '22 at 15:55
  • Because `data.size()` returns the element count of the std::vector, and by multiplying it by `sizeof(uint32)` I get the size in bytes, which is what the function needs – Can Apr 18 '22 at 16:21
  • Sorry, I looked up the documentation of `glShaderBinary` and I saw that it actually needs the element count and not the size in bytes, but the same issue still remains – Can Apr 18 '22 at 16:26
  • So, ```data``` is an array of integers? I am not familiar with precompiled shaders. Why not bytes? Another question: does the same thing happen if you compile the shaders textually? – armagedescu Apr 18 '22 at 16:39
  • @Can: What errors do you get when you specialize the shader binary? – Nicol Bolas Apr 18 '22 at 16:55
  • @NicolBolas I don't get any errors when I run the function; the program crashes when I try to bind the resulting shader – Can Apr 18 '22 at 17:03
  • @armagedescu the SPIR-V API returns a `CompilationResult` object that contains the binary data as `uint32`. The difference from textually compiled shaders is that there you have to call `glCompileShader` additionally, so that OpenGL compiles the text into a binary itself. So I think omitting `glCompileShader` here, where the shader binary is loaded directly, should be correct, right? – Can Apr 18 '22 at 17:08
  • @Can Try to answer this question: is ```data``` an array of ```integers```? Why not ```bytes```? There is something fundamental in this question. From the documentation: "The binary​ is the loaded SPIR-V itself, with a ```byte``` length of length​." (from here: https://www.khronos.org/opengl/wiki/SPIR-V; a sketch of the byte-length call follows these comments) – armagedescu Apr 18 '22 at 17:08
  • @armagedescu Thanks for the resource, I will rewrite my function and check if it works then, but I used my function in previous projects as well and it worked well there. Is the specification of the binary data new? – Can Apr 18 '22 at 17:13
  • @armagedescu: SPIR-V is [defined as a sequence of words (32-bit LE-encoded unsigned integers)](https://www.khronos.org/registry/SPIR-V/specs/unified1/SPIRV.html#PhysicalLayout). – Nicol Bolas Apr 18 '22 at 17:14
  • @Can: "*I don't get any errors when I run the function*" I don't see you checking for errors after specializing the shader. – Nicol Bolas Apr 18 '22 at 17:16
  • @NicolBolas Yes, you were right. I added the missing code to my project and to the function above, but it did not print anything to the console, so the compilation should be successful – Can Apr 18 '22 at 17:30
  • @armagedescu I changed the data type to `unsigned char`, but that does not even compile, because the SPIRV_Cross Compiler class needs to get a vector of `uint32`, as written here: https://github.com/KhronosGroup/SPIRV-Cross#using-the-c-api – Can Apr 18 '22 at 17:31
  • @Can I've seen that. The second question was whether your program behaves similarly when not using precompiled binaries, the classic way. – armagedescu Apr 18 '22 at 17:40
  • @armagedescu I did not test that; I wanted to implement SPIR-V right away, which is why the classic way is currently not implemented. Btw, I don't know if it helps, but this is the whole class: https://github.com/HighLo-Engine/HighLo-Engine/blob/rendering/HighLo/src/Engine/Platform/OpenGL/OpenGLShader.cpp – Can Apr 18 '22 at 17:42
  • @Can: I have a hunch what might be going on here, but before I state what this hunch is, I have two questions: 1. Are you testing this with a Nvidia GPU (and thus the Nvidia OpenGL drivers)? 2. Did you compile to SPIR-V with, or without debugging symbols and reflection data? – datenwolf Apr 18 '22 at 17:44
  • @datenwolf 1. Yes, I am running a GeForce GTX 1060 with 6GB VRAM, and 2. yes, I have set the compile options to generate the debug info as well with `shaderc::CompileOptions options; options.SetGenerateDebugInfos();`, so that should enable the generation of the debug info, right? – Can Apr 18 '22 at 17:48
  • @Can: Yes. And this invalidates my hunch (FYI, the Nvidia OpenGL driver isn't conformant with respect to loading SPIR-V binaries that don't have, or were stripped of, their debugging information. I ran into that problem some time ago and can reproduce it, but have yet to implement a proper MVCE for a bug report). – datenwolf Apr 18 '22 at 17:59
  • @datenwolf Ah, good to know, thank you. So I should always generate the debug info to be safe, right? What about the release build? Does it work without the debug info there? – Can Apr 18 '22 at 18:17
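Following up on the byte-length discussion above: `glShaderBinary` expects the length of the binary in bytes, while `data.size()` on the `std::vector<uint32>` is the number of 32-bit words. This is only a sketch of the adjusted call based on the linked documentation, using the same `data` and `shaderId` as in the function above:

// Sketch: pass the SPIR-V blob length in bytes (word count * sizeof(uint32))
glShaderBinary(1, &shaderId, GL_SHADER_BINARY_FORMAT_SPIR_V,
               data.data(), static_cast<GLsizei>(data.size() * sizeof(uint32)));
glSpecializeShader(shaderId, "main", 0, nullptr, nullptr);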

0 Answers