In my (C++/OpenGL) program, I am loading a set of textures and setting the texture parameters as follows:
//TEXTURES
glGenTextures(1, &texture1);
glBindTexture(GL_TEXTURE_2D, texture1);
// set the texture wrapping parameters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
// set texture filtering parameters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
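For context, the pixel upload itself (omitted above) is followed by mipmap generation, since GL_LINEAR_MIPMAP_LINEAR needs a mipmap chain. Here is a rough sketch, with stbi_load standing in for whatever image loader is actually used:
// Sketch only: upload pixels and generate the mipmap chain that
// GL_LINEAR_MIPMAP_LINEAR needs (stb_image used purely for illustration).
int width, height, channels;
unsigned char *data = stbi_load("texture1.png", &width, &height, &channels, 4);
if (data)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, data);
    glGenerateMipmap(GL_TEXTURE_2D);
    stbi_image_free(data);
}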
I found out that anisotropic filtering would help enhance the look of the scene, so I used this line to enable it:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY, 16);
While this line compiled without problems on my laptop (which has an AMD GPU), I cannot get it to compile on my other computer, which uses Intel(R) HD Graphics 530 (Skylake GT2). Specifically, compiling it with g++ produces the following error:
error: ‘GL_TEXTURE_MAX_ANISOTROPY’ was not declared in this scope
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY, 16);
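As a possible workaround (I am not sure it is the right approach), I considered falling back to the older token from EXT_texture_filter_anisotropic whenever the core GL 4.6 name is not declared by the headers, roughly like this:
// Sketch: use the core OpenGL 4.6 token if the loader headers define it,
// otherwise fall back to the EXT_texture_filter_anisotropic token.
#ifdef GL_TEXTURE_MAX_ANISOTROPY
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY, 16.0f);
#elif defined(GL_TEXTURE_MAX_ANISOTROPY_EXT)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);
#endif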
Running the following command in my Linux terminal:
glxinfo | grep -i opengl
reveals the following details about my GPU vendor and OpenGL support:
I understand that anisotropic filtering is exposed through the ARB_texture_filter_anisotropic extension (and was promoted to core in OpenGL 4.6), but I honestly don't know how to check whether my GPU supports this extension, and, if it does, how do I actually enable anisotropic filtering?
BTW: I am using GLFW3 and the GLAD loader.
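Here is a minimal sketch of what I imagine the runtime check could look like; GL_TEXTURE_MAX_ANISOTROPY_EXT and GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT come from EXT_texture_filter_anisotropic, and the #defines are only there in case the generated header does not declare them:
// Sketch: check at runtime whether either anisotropic-filtering extension is
// advertised, then clamp the requested amount to the driver maximum.
// Needs <cstring> for std::strcmp and <algorithm> for std::min.
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT 0x84FE
#endif
#ifndef GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

bool hasAniso = false;
GLint numExtensions = 0;
glGetIntegerv(GL_NUM_EXTENSIONS, &numExtensions);
for (GLint i = 0; i < numExtensions; ++i)
{
    const char *ext = reinterpret_cast<const char *>(glGetStringi(GL_EXTENSIONS, i));
    if (std::strcmp(ext, "GL_ARB_texture_filter_anisotropic") == 0 ||
        std::strcmp(ext, "GL_EXT_texture_filter_anisotropic") == 0)
    {
        hasAniso = true;
        break;
    }
}
if (hasAniso)
{
    GLfloat maxAniso = 0.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    std::min(16.0f, maxAniso));
}
I believe GLAD can also expose per-extension flags such as GLAD_GL_EXT_texture_filter_anisotropic when that extension is selected during loader generation, but I am not sure whether that is the intended way to do this.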