
I have been learning about OpenGL recently and was searching for information about error handling. I learned to use glGetError, which returns (and clears) the error flags, and created a function that prints their hexadecimal values. The relevant C++ code looks like this:

GLClearError();
glDrawElements(GL_TRIANGLES, 6, GL_INT, nullptr);
GLCheckError();

with GL_INT written on purpose (the valid index types are GL_UNSIGNED_BYTE, GL_UNSIGNED_SHORT and GL_UNSIGNED_INT), so the call is guaranteed to fail. GLClearError() just clears the error flags and GLCheckError() prints the errors produced by glDrawElements.
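For reference, the two helpers look roughly like this (simplified):

#include <GL/glew.h>
#include <cstdio>

// Drain any stale error flags so the next check reports only new errors.
void GLClearError()
{
   while (glGetError() != GL_NO_ERROR)
      ;
}

// Pop and print, in hexadecimal, every error flag raised since the last clear.
void GLCheckError()
{
   while (GLenum error = glGetError())
      std::printf("[OpenGL Error] 0x%04X\n", error);
}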

I get 500 as the output. I know I can go (in my case, since I use GLEW) to glew.h and search for the error with this number; here it is GL_INVALID_ENUM, defined as 0x0500. It's a preprocessor definition, so my question is the following: is it possible to write a C++ function that returns the definition's name given its value? I use C++11, OpenGL 4.6 and GLEW 2.1, if that changes anything.

vrodriguez

2 Answers


is it possible to create in C++ a function that returns the definition name by entering its value?

Yes, it is.

Unless someone has done it already, you'll have to do it in your application code.

E.g.

#include <map>
#include <string>

enum MyEnum { ENUM_VALUE_1, ENUM_VALUE_2, ENUM_VALUE_3 /* ... */ };

std::string getEnumName(MyEnum e)
{
   // Built on the first call; maps each enumerator to its spelled-out name.
   static const std::map<MyEnum, std::string> nameMap =
   {
      {ENUM_VALUE_1, "ENUM_VALUE_1"},
      {ENUM_VALUE_2, "ENUM_VALUE_2"},
      {ENUM_VALUE_3, "ENUM_VALUE_3"},
      // ...
   };

   // find() rather than operator[], so an unknown value doesn't insert
   // an empty entry into the map.
   auto it = nameMap.find(e);
   return it != nameMap.end() ? it->second : "unknown";
}
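Applied to the OpenGL error codes from the question, the same pattern could look like this (a sketch; glErrorName is a name invented here, and the list of codes is not exhaustive):

#include <GL/glew.h>
#include <map>
#include <string>

std::string glErrorName(GLenum err)
{
   static const std::map<GLenum, std::string> nameMap =
   {
      {GL_NO_ERROR,          "GL_NO_ERROR"},
      {GL_INVALID_ENUM,      "GL_INVALID_ENUM"},
      {GL_INVALID_VALUE,     "GL_INVALID_VALUE"},
      {GL_INVALID_OPERATION, "GL_INVALID_OPERATION"},
      {GL_OUT_OF_MEMORY,     "GL_OUT_OF_MEMORY"},
   };

   auto it = nameMap.find(err);
   return it != nameMap.end() ? it->second : "unknown error";
}

With this, glErrorName(0x0500) returns "GL_INVALID_ENUM".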
R Sahu

Unless the library provides, or you yourself write, a function to translate the numeric error into a string representation, it's not possible.

There is no way, at run time, to translate the value of a macro (#define) into the name of that macro: the preprocessor replaces each name with its value before the compiler even runs, so the names don't exist in the compiled program.

But it's perfectly possible to write a function that takes an error code, looks it up in a table, and returns a string (that you write) for each error.
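One way to build such a table is with the preprocessor's stringizing operator (#), so each name is only spelled once (a sketch; CASE_RETURN_NAME is a helper macro invented here):

#include <GL/glew.h>
#include <string>

// Expands to a case label that returns the macro's own name as a string.
#define CASE_RETURN_NAME(code) case code: return #code;

std::string glErrorToString(GLenum err)
{
   switch (err)
   {
      CASE_RETURN_NAME(GL_NO_ERROR)
      CASE_RETURN_NAME(GL_INVALID_ENUM)
      CASE_RETURN_NAME(GL_INVALID_VALUE)
      CASE_RETURN_NAME(GL_INVALID_OPERATION)
      CASE_RETURN_NAME(GL_INVALID_FRAMEBUFFER_OPERATION)
      CASE_RETURN_NAME(GL_OUT_OF_MEMORY)
      default: return "unknown GL error";
   }
}

This works because an argument used with # is stringized before macro expansion, so CASE_RETURN_NAME(GL_INVALID_ENUM) yields case 0x0500: return "GL_INVALID_ENUM";.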

Jesper Juhl