1

What would be the best way to send my bit flag to the fragment shader in order to be able to if() against it?

I have the following bit flag (enum):

uint32_t options;

enum Options {
    ON                  = 1 << 0,   /// row 1 | enable or disable
    OFF                 = 1 << 1,
    DUMMY1              = 1 << 2,
    DUMMY2              = 1 << 3,
    NONE                = 1 << 4,   /// row 2 | contours
    SILHOUETTE          = 1 << 5,
    SUGGESTIVE          = 1 << 6,
    APPARENTRIDGES      = 1 << 7,
    PHOTOREALISTIC      = 1 << 8,   /// row 3 | shading
    TONE                = 1 << 9,
    TONESPLASHBACK      = 1 << 10,
    EXAGGERATED         = 1 << 11
};

This corresponds to the following table, where each cell holds the bit value of the option at that row and column, written as an integer:

[1]   [2]   [4]    [8]
[16]  [32]  [64]   [128]
[256] [512] [1024] [2048]

So, for example, when the first option of every row is selected, my bit flag (options) has the value 273 (1 + 16 + 256). In every row only one option can be selected.

Now, when I want to check which options are enabled on the CPU using the bit flag, I can simply do the following (for the example case where the first column is selected):

if (options & ON) {}             // true
if (options & OFF) {}            // false
if (options & PHOTOREALISTIC) {} // true

The idea is to execute different parts of the shader based on the selection encoded in the bit flag. For this purpose I need to do something like:

if (options == 273)
  // enable the object, render with no contours and in a photorealistic manner

and skip (in the shader) the rest of the options, which are disabled. Ideally, however, I would like to simplify this to how it is done on the CPU, using the bit flags. So in my shader I would like to have something along the lines of:

if ((options & PHOTOREALISTIC) && (options & ON)) // true
    // do stuff

Is it possible to achieve something like this? Maybe not the exact same thing, but something more elegant than simply if()-ing against all possible integer values resulting from the bit flag (like if (options == 1+16+256), if (options == 1+16+512), ..., if (options == 8+128+2048))?

bastijn
  • You can pass integer uniforms to a shader, and perform integer (bitwise) operations in a shader. What is the trouble, exactly? – Asher Dunn Mar 09 '11 at 20:51
  • Uh, that I did not think of such an easy solution as sending my enums as uniforms and thought I was forced to write it as option == value instead of option & (uniform | uniform2) (with uniforms being my enums) =[. – bastijn Mar 10 '11 at 08:38
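
A hedged sketch of the host side of that suggestion (the program handle program and the uniform name "options" are my own assumptions; glUniform1ui needs a GL 3.0+ context, with older GL the flag can be passed via glUniform1i instead):

// Upload the whole bit flag as a single integer uniform, once per draw.
GLint loc = glGetUniformLocation(program, "options");
glUseProgram(program);
glUniform1ui(loc, options);   // options is the uint32_t built from the enum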

2 Answers

3

There is no point in an elegant enumeration of your flags, because of the different code paths you want to use. In C++ you could do something like a jump table of functions, but in GLSL you can't (and believe me, you don't want to). So check in the regular way:

if (options & (ON | PHOTOREALISTIC)) {  /*do something*/ }
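
For reference, a minimal fragment-shader sketch of this kind of test, assuming GLSL 1.30 or newer (earlier versions lack integer bitwise operators). The constant names simply mirror the C++ enum; note that GLSL has no implicit integer-to-bool conversion, so the masked value must be compared against 0u explicitly:

#version 130

const uint ON             = 1u;    // 1 << 0
const uint PHOTOREALISTIC = 256u;  // 1 << 8

uniform uint options;              // the CPU-side bit flag

out vec4 fragColor;

void main()
{
    if ((options & (ON | PHOTOREALISTIC)) != 0u) {  // non-zero if either flag is set
        fragColor = vec4(1.0);     // e.g. photorealistic path
    } else {
        fragColor = vec4(0.5);     // some other path
    }
}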

As an alternative, in case you are not bound to the über-shader approach, consider building the GLSL program from separate function blocks implemented in separate GLSL shader objects.

For example, one block may implement a regular renderer while the second does it in a photorealistic way. Both should define the same function name, which is declared in the main shader and called from there. When linking the GLSL program, you attach only the one block that matches the desired behavior.
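
A hedged sketch of that idea; compile() is a hypothetical helper (glCreateShader + glShaderSource + glCompileShader), and the source strings are assumed to hold the GLSL shown in the comments:

// main.frag only declares the shading function and calls it:
//     #version 130
//     vec4 shade();                    // defined in another shader object
//     out vec4 fragColor;
//     void main() { fragColor = shade(); }
//
// regular.frag and photorealistic.frag each define their own vec4 shade() { ... }.

GLuint program = glCreateProgram();
glAttachShader(program, compile(GL_VERTEX_SHADER,   vertexSrc));
glAttachShader(program, compile(GL_FRAGMENT_SHADER, mainFragSrc));
// Attach exactly one implementation of shade(), chosen on the CPU side:
glAttachShader(program, compile(GL_FRAGMENT_SHADER,
                                photorealistic ? photoFragSrc : regularFragSrc));
glLinkProgram(program);

This avoids the runtime branching entirely, at the cost of managing one linked program per combination of options.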

kvark
  • How do you create separate compile blocks in GLSL? – Mark Ingram Jul 01 '13 at 08:53
  • A GLSL program is linked from GLSL shader objects in a way very similar to C linkage. Basically, you compile a set of objects, attach them to the program, and link it all together. Each object has access to the functions of the other objects in the same shader stage (be it vertex, fragment, etc.). – kvark Nov 14 '13 at 15:06
0

Can you use this?

if ((options & (PHOTOREALISTIC | ON)) == (PHOTOREALISTIC | ON))
Erik