
In my (C++/OpenGL) program, I am loading a set of textures and setting the texture parameters as follows:

//TEXTURES
glGenTextures(1, &texture1);
glBindTexture(GL_TEXTURE_2D, texture1);
// set the texture wrapping parameters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
// set texture filtering parameters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

I found out that anisotropic filtering would help enhance the look of the scene. Therefore, I used this line to achieve it:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY, 16);

While this line of code compiled without problems on my laptop (which has an AMD GPU), I cannot get it to compile on my other computer, which uses Intel(R) HD Graphics 530 (Skylake GT2). Specifically, compiling it with g++ produces the following error:

error: ‘GL_TEXTURE_MAX_ANISOTROPY’ was not declared in this scope
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY, 16);

More specifically, running the following command in my Linux terminal:

glxinfo | grep -i opengl

reveals the following details about my GPU vendor and OpenGL support:

(screenshot of the glxinfo output, showing the GPU vendor and supported OpenGL version, omitted)

I understand that anisotropic filtering was introduced in ARB_texture_filter_anisotropic, but I honestly don't know how to check whether my GPU supports that extension, and, if it does, how do I make it possible to use anisotropic filtering?

BTW: I am using glfw3 and GLAD loader.

Nuwanda (question edited by genpfault)

2 Answers


The anisotropy value is a floating-point value, so use the f variant of glTexParameter: e.g.,

glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, value);

Where value is a floating-point value. It's worth noting that although anisotropic filtering was technically not part of the core GL standard until OpenGL 4.6, it can be considered a ubiquitous extension. That is, you can rely on its existence on all platforms that matter.

If you want to clamp to some maximum anisotropy available, try something like:

GLfloat value, max_anisotropy = 8.0f; /* don't exceed this value... */
glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &value);

value = (value > max_anisotropy) ? max_anisotropy : value;
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, value);
Brett Hale (answer edited by Rabbid76)
  • Actually, your suggestion solved all my problems. My problem was using `glTexParameteri` instead of `glTexParameterf`. I think the max value for anisotropic filtering is 16.0f, so I have just fixed the value parameter to 16.0f. Also, at the beginning of the texture loading, I am checking whether the extension is supported by running `glfwExtensionSupported("GL_ARB_texture_filter_anisotropic")`. After that, a conditional flag decides whether to use anisotropic filtering or not. Thanks a lot! – Nuwanda May 02 '19 at 11:24
    @BrettHale: "*over-engineered C++ solution that the OP did not ask for. There was no C++ tag in the question.*" The third word of the question says he's using C++. So C++ and its standard library are clearly applicable. Furthermore, your answer doesn't address the *title*. Rabbid's answer does. If a user is searching for this question and wants to know how to check if an extension is visible, then he won't find that answer. This is why I've changed the title to better match what the OP wants. – Nicol Bolas May 02 '19 at 13:30
    "The OP's error was calling `glTexParameteri` rather than `glTexParameterf`" No, it was not. Using this enum with the `i` variant is actually allowed [as per the spec](https://www.khronos.org/registry/OpenGL/extensions/ARB/ARB_texture_filter_anisotropic.txt). Also, the _C++ compiler_ would not generate the error message the OP got in that case. The solution was that you actually (silently) switched from `GL_TEXTURE_MAX_ANISOTROPY` (which originates from GL 4.6 / `ARB_texture_filter_anisotropic` from 2017) to the `_EXT` variant (which has been ubiquitous for decades). – derhass May 02 '19 at 18:26
error: ‘GL_TEXTURE_MAX_ANISOTROPY’ was not declared in this scope

This GLenum value was defined in GL_ARB_texture_filter_anisotropic, which is also a core feature of OpenGL 4.6. It is not clear what mechanism for OpenGL extension handling you are using, or whether you use a particular GL loader library.

However, chances are that on your other system the system-installed glext.h, or some header of your loader like glew.h or glad.h or whatever you use, is not as recent as the one on the system where it compiled. As a result, this value will not be defined.

In the case of anisotropic filtering, this is not a big issue, since GL_EXT_texture_filter_anisotropic offers exactly the same functionality and has been around since the year 2000, so you can just switch to the constant GL_TEXTURE_MAX_ANISOTROPY_EXT. The reason this extension was so late to be promoted to ARB status and core GL functionality was a set of patents, which finally expired only recently.

derhass
  • Thank you for your suggestion. I have actually stated that I use the Glad loader in the last sentence of my question. And yes, I am using `#define GLFW_INCLUDE_GLEXT` before `#include <GLFW/glfw3.h>`, which should silently include the glext header along with the GLFW library. I guess you are right that the glext header on one of my systems might be more recent than on the other. Thank you. – Nuwanda May 08 '19 at 08:18
  • If you use glad, include `glad.h` first, and better yet, don't try to include any other `GL.h`/`glext.h` at all; glad replaces all of these. – derhass May 08 '19 at 17:23