
First I'd like to mention that I found what I believe to be the exact same question, unfortunately without an answer, here: Java Using OpenGL Stencil to create Outline

I will post my code below, but first here is the problem: as the capture** shows, the entire wireframe of the sphere is visible, instead of a single outline around its silhouette. I would like to get rid of all those lines inside!

** Apparently I cannot add pictures: see this link - imagine a sphere with all the edges of its quads visible as thick, 3-pixel-wide lines.
http://srbwks36224-03.engin.umich.edu/kdi/images/gs_sphere_with_frame.jpg


Here is the code giving that result:

// First render the sphere:
// inside "show" is all the code to display a textured sphere
// looking like earth
sphe->show();

// Now get ready for stencil buffer drawing pass:
// 1. Clear and initialize it
// 2. Activate stencil buffer
// 3. On the first rendering pass, we want to "SUCCEED ALWAYS"
//    and write a "1" into the stencil buffer accordingly
// 4. We don't need to actually render the object, hence disabling RGB mask
glClearStencil(0);   //Edit: swapped this line and below
glClear(GL_STENCIL_BUFFER_BIT);                 
glEnable(GL_STENCIL_TEST);                  
glStencilFunc(GL_NEVER, 0x1, 0x1);          //Edit: GL_ALWAYS
glStencilOp(GL_REPLACE, GL_KEEP, GL_KEEP);  //Edit: GL_KEEP, GL_KEEP, GL_REPLACE                    
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);   // As per Andon's comment
sphe->show();

// At this point, I expect to have "1" on the entire
// area covered by the sphere, so...
// 1. Stencil test should fail for anything but a 0 value
//    RM: the commented-out line below is another option that
//    should work too, I believe
// 2. The stencil op instruction is somewhat irrelevant at this point
//    (if my understanding is correct), because we won't do anything
//    else with the stencil buffer after that.
// 3. Re-enable RGB mask, because we want to draw this time  
// 4. Switch to LINE drawing instead of FILL and 
// 5. set a bigger line width, so it will exceed the model boundaries. 
//    We do want this, otherwise the line would not show 
// 6. Don't mind the "uniform" setting instruction, this is so
//    that my shader knows it should draw in plain color
// 7. Draw the sphere's frame
// 8. The principle, as I understand it is that all the lines should
//    find themselves matched to a "1" in the stencil buffer and therefore
//    be ignored for rendering. Only lines on the edges of the model should
//    have half their width not failing the stencil test.
glStencilFunc(GL_EQUAL, 0x0, 0x1);
//glStencilFunc(GL_NOTEQUAL, 0x1, 0x1);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);
glLineWidth(3);
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);  
psa::shader::setUniform("outlining", 1);
sphe->show();
psa::shader::setUniform("outlining", 0);

Now just to prove a point, I tried something different with the stencil buffer - I just wanted to make sure that everything in my code was in place for it to work.

** Again I can unfortunately not show a screen capture of the result I get: the scene is like this
http://mathworld.wolfram.com/images/eps-gif/SphereSphereInterGraphic_700.gif
But the smaller sphere is invisible (RGB mask deactivated) and one can see the world background through the hole (instead of the inside of the bigger sphere - face culling is deactivated).

And this is the code... Interestingly, I can change many things - activate/deactivate STENCIL_TEST, change the operation to GL_KEEP everywhere, or even change the second stencilFunc to "NOT EQUAL 0" - and the result is always the same! I think I am missing something basic here.

void testStencil()
{
    // 1. Write a 1 in the Stencil buffer for 
    // every pixels of the first sphere:
    // All colors disabled, we don't need to see that sphere
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0x1, 0x1);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);   // Edit: added this
    {
        sphe->W = mat4::trans(psa::vec4(1.0, 1.0, 1.0)) * mat4::scale(0.9);
        sphe->show();
    }

    // 2. Draw the second sphere with the following rule:
    // fail the stencil test for every pixel holding a 1.
    // This means that any pixel from the first sphere will
    // not be drawn as part of the second sphere.
    glStencilFunc(GL_EQUAL, 0x0, 0x1);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);    // Edit: added this
    {
        sphe->W = mat4::trans(psa::vec4(1.2, 1.2, 1.2)) * mat4::scale(1.1);
        sphe->show();
    }
}

Et voila! If anyone could point me in the right direction I would very much appreciate it. I'll also make sure to link your answer from the other post I found.

Philippe
  • This is not related to your problem, but it is a problem on its own... `glClearStencil (0)` sets the value to apply when you call `glClear (GL_STENCIL_BUFFER_BIT)`. You have called these two functions in the wrong order, but it probably does not matter since the default clear value is **0** anyway. – Andon M. Coleman Mar 22 '15 at 20:55
  • One thing I do believe may be related, however, is your depth buffer. You disabled color writes on your stencil pass, but the depth buffer is still written. What you are describing: _"one can see the world background through the hole (instead of the inside of the bigger sphere - face culling is deactivated). "_ appears to be due to depth testing against the sphere that didn't write color. Try disabling depth writes as well as color writes and consider clearing the depth buffer between passes; depth sounds like the culprit, but the order of drawing is not crystal clear to me right now. – Andon M. Coleman Mar 22 '15 at 21:08
  • Thank you Andon, I corrected my mistake on the stencil clearing, but as you suspected this was not the issue. – Philippe Mar 22 '15 at 21:28
  • You had a very good point regarding the depth buffer: I added glDepthMask(GL_FALSE / TRUE) to accompany the glColorMask instructions. It now looks like my "testStencil" function has literally no effect - it displays a whole, unaltered sphere... Is there any chance I simply have no stencil buffer at all? I do have a "SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 1)" instruction at window creation time. – Philippe Mar 22 '15 at 21:34
  • There's generally no such thing as a 1-bit stencil buffer. I think at minimum, to make that work, you are going to need a value of **8**. GPU-based renderers generally only support 8-bit stencil and they combine it with the depth buffer. So 24-bit depth and 8-bit stencil (which combine to 32-bit) is generally what you need to use. 32-bit depth + 8-bit stencil _is_ supported on DX10 GPUs, but you have to jump through hoops to get it and it wastes 24-bits (32-bit depth + 8-bit stencil + 24-bit unused) ;) – Andon M. Coleman Mar 22 '15 at 22:58
  • Thanks Andon, I've been searching for hours now (happy Sundays!) and found out about these 8 bits for the Stencil size, like you suggest. Unfortunately it still isn't enough to resolve the issue. I was so out of ideas that I ended up migrating my window creation from SDL to glut... The outlining worked immediately - no inner edges. Thus it has to do with my SDL code (tried both 1.2 and 2.0). I will close this question, since the OpenGL code is correct. – Philippe Mar 22 '15 at 23:20

1 Answer


The OpenGL code posted in this question works. The cause of the problem lay in the window initialization/creation:

Below are respectively the SDL1.2 and SDL2 versions of the code that work. Note that in both cases the SetAttribute calls are placed before the window creation. The main problem is that misplaced calls do not necessarily fail at run time - they simply have no effect.

SDL1.2:

if(SDL_Init(SDL_INIT_EVERYTHING) < 0) 
{
    throw "Video initialization failed";
}

SDL_GL_SetAttribute(SDL_GL_RED_SIZE,     5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,   5);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,    5);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,  24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES,16);  
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);

const SDL_VideoInfo * i;
if((i = SDL_GetVideoInfo()) == NULL)  
{
    throw "Video query failed";
}

int flag = (fs ? SDL_OPENGL | SDL_FULLSCREEN : SDL_OPENGL);
if(SDL_SetVideoMode(w, h, i->vfmt->BitsPerPixel, flag) == 0)
{
    throw "Video mode set failed";
}

glewExperimental = GL_TRUE;
if(glewInit() != GLEW_OK)
{
    throw "Could not initialize GLEW";
}

if(!glewIsSupported("GL_VERSION_3_3"))
{
    throw "OpenGL 3.3 not supported";
}

SDL2 (same code essentially, just the window creation functions change):

if(SDL_Init(SDL_INIT_EVERYTHING) < 0) 
{
    throw "Video initialization failed";
}

SDL_GL_SetAttribute(SDL_GL_RED_SIZE,     5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,   5);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,    5);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,  24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES,16);  
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);

int flag = SDL_WINDOW_OPENGL;
if((win = SDL_CreateWindow("engine", 100, 100, w, h, flag)) == NULL)
{
    throw "Create SDL Window failed";
}

context = SDL_GL_CreateContext(win);

glewExperimental = GL_TRUE;
if(glewInit() != GLEW_OK)
{
    throw "Could not initialize GLEW";
}

if(!glewIsSupported("GL_VERSION_3_3"))
{
    throw "OpenGL 3.3 not supported";
}

A few points worth mentioning:
1. Even though I understand it is not advised, SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 1) works too.
2. In SDL2, SDL_GL_MULTISAMPLESAMPLES goes up to 16 for me; after that glewInit() fails. BUT move the multisampling setting after the window creation and suddenly glewInit() stops complaining: my guess is that it is simply being ignored.
3. In SDL1.2, any multisampling value "seems" to work.
4. As far as the stencil buffer is concerned, the code below works too, but I post it essentially to raise the question: how many of the attribute settings are actually applied? And how would one know, since the code compiles and runs without apparent problems?

// PROBABLY WRONG:
// ----
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);

const SDL_VideoInfo * i;
if((i = SDL_GetVideoInfo()) == NULL)  
{
        throw "Video query failed";
}

int flag = (fs ? SDL_OPENGL | SDL_FULLSCREEN : SDL_OPENGL);
if(SDL_SetVideoMode(w, h, i->vfmt->BitsPerPixel, flag) == 0)
{
    throw "Video mode set failed";
}

// No idea if the below is actually applied!
SDL_GL_SetAttribute(SDL_GL_RED_SIZE,     5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,   5);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,    5);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,  24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 16); 
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
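Regarding point 4, SDL can report what the driver actually granted: SDL_GL_GetAttribute fills in the value the created context really has, which may differ from what was requested. A minimal sketch (SDL2 naming; to be called after SDL_GL_CreateContext, so the exact placement here is an assumption):

```cpp
// Query back the attributes the context actually received.
// A request silently dropped (e.g. a SetAttribute made after window
// creation) shows up here as a value different from what was asked for.
int stencil = 0, depth = 0, samples = 0;
SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &stencil);
SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE,   &depth);
SDL_GL_GetAttribute(SDL_GL_MULTISAMPLESAMPLES, &samples);
printf("stencil: %d bits, depth: %d bits, MSAA: %dx\n",
       stencil, depth, samples);
if (stencil == 0)
{
    throw "No stencil buffer granted - check SetAttribute placement";
}
```

With this check in place, the "invisible" failure mode described above (misplaced SetAttribute calls that run fine but do nothing) becomes detectable at startup.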
Philippe
  • That's not really a quality answer if I'm being honest. You should probably consider editing your question to show the initialization you tried with SDL. We would be able to help you sort _that_ problem out and let you arrive at a real solution that doesn't involve using a deprecated framework like GLUT. – Andon M. Coleman Mar 23 '15 at 19:52
  • I just thought that as far as the subject goes "outlining 3D model with stencil buffer", the answer is there. It felt to me, that shifting into SDL and/or glut considerations was beside the point. In any case, I will post my findings in a moment. – Philippe Mar 24 '15 at 18:09