
I have a C++/OpenGL render engine that uses a library to do the character animation and rendering. Each character uses a different mix of state: some use textures, some don't; some use vertex shaders, some don't; and so on. I want to be able to add a tint to some of the characters, but not all of them. I think the easiest way to do this is with a fragment shader that applies the tint color to the fragment's color. I am using Cg, as this is a requirement of the project.

The main body of my rendering engine would be something like this (a rough sketch of the calls follows the list):

  • Enable my tint fragment shader
  • Call library code to do character rendering
  • Disable my tint fragment shader
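
In Cg-runtime terms, that flow looks roughly like the sketch below. This is only an illustration of the sequence above, not my actual code: `drawCharacter` and `libraryRenderCharacter` are made-up names, and the program/profile handles are assumed to have been created elsewhere with cgCreateProgram / cgGLLoadProgram and cgGLGetLatestProfile.

```cpp
#include <Cg/cg.h>
#include <Cg/cgGL.h>

void libraryRenderCharacter();  // placeholder for the library's draw entry point

// Hypothetical sketch of the enable/render/disable flow.
void drawCharacter(CGprogram tintProgram, CGprofile fragProfile, bool tinted)
{
    if (tinted) {
        cgGLBindProgram(tintProgram);     // bind the tint fragment program
        cgGLEnableProfile(fragProfile);   // route fragments through the Cg program
    }

    libraryRenderCharacter();             // library code does the character rendering

    if (tinted) {
        cgGLDisableProfile(fragProfile);  // back to fixed-function fragment processing
    }
}
```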

Within the shader, the tint is applied by multiplying the fragment color, the texture sample, and the tint color together. This all works fine except when no texture is enabled/bound to GL_TEXTURE_2D; in that case I just get black. I've been able to work around this by using textureSize and checking for a texture width greater than 1, but that feels fairly cheesy. Is there a better way to do this?
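
For reference, the shader body is just a modulate, roughly like the sketch below (the parameter names are illustrative, not my exact code; `decal` is assumed to be the sampler bound to the character's texture and `tintColor` is set from the C++ side). The textureSize check I mention above guards the tex2D term in this shader.

```cg
// Minimal sketch of the tint fragment shader (illustrative names).
float4 main(float4 color          : COLOR0,
            float2 texCoord       : TEXCOORD0,
            uniform sampler2D decal,        // character texture (texture unit 0)
            uniform float4 tintColor) : COLOR
{
    // Modulate the interpolated color, the texture sample, and the tint.
    return color * tex2D(decal, texCoord) * tintColor;
}
```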

Also, as I have implemented it, textures are always applied as though the GL_MODULATE texture environment mode were set. It would be nice to know what the current OpenGL setting is and apply that instead.
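
If it helps, I believe the fixed-function setting in question is the texture environment mode, which (as far as I know) can be queried on the C++ side with something like this untested sketch and then passed to the shader as a parameter:

```cpp
#include <GL/gl.h>

// Untested sketch: query the current texture environment mode for the active
// texture unit (GL_MODULATE, GL_REPLACE, GL_DECAL, ...).
GLint queryTexEnvMode()
{
    GLint mode = GL_MODULATE;  // GL_MODULATE is the default mode
    glGetTexEnviv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, &mode);
    return mode;
}
```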

    "*I am using Cg*" Um, Cg died *years* ago. You need to get with your project manager and help them move on to something more relevant. "*using textureSize*" That's not a function available to Cg, so how are you using it? – Nicol Bolas Sep 01 '20 at 03:21
  • I agree on moving past Cg. That said, I am where I am. – GiantBen Sep 01 '20 at 14:08
  • So, how do you use `textureSize` in Cg? – Nicol Bolas Sep 01 '20 at 14:09
  • I also thought textureSize would be unavailable, but it worked when I added it to my program. I am using calls like cgCreateProgram and cgGLBindProgram, etc., to run my program. Maybe the Cg compiler is automatically upgrading to newer tech on the fly? I can't say, honestly. – GiantBen Sep 01 '20 at 14:10
