
When I try to render a perfect cube, one vertex renders in the middle of the cube, at (.5, .5, -.5) instead of (1, 1, 1), and one renders one unit too high, at (1, 1, 1) instead of (1, 0, 0). I have no clue how that could happen. I first checked whether it had to do with the OBJ loading by defining the arrays separately; it didn't. This is the output of the program, and this is the same scene from another perspective for a better understanding of the problem.

First I initialize glfw, create a window, make a context and set various variables:

if (!glfwInit())
{
    std::cout << "Failed to initialize GLFW, press enter to close the application..." << std::endl;
    std::cin.get();
    return false;
}

GLFWwindow* window = glfwCreateWindow(m_Width, m_Height, m_Title, NULL, NULL);

if (!window)
{
    glfwTerminate();
    std::cout << "Failed to create GLFW window, press enter to close the application..." << std::endl;
    std::cin.get();
    return false;
}

glfwMakeContextCurrent(window);

Then I enable some GL options and set various other variables:

glClearColor(0, 0, 0, 0);

glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glFrontFace(GL_CW);
glEnable(GL_DEPTH_TEST);
glDepthRange(0, 1);
glEnable(GL_FRAMEBUFFER_SRGB);

After that is done, I declare a Vertex array and an indices array and pass them to OpenGL (Vertex is just a structure containing three floats):

Vertex verticies[] = { Vertex(Vector3(  1, -1, -1)),
                       Vertex(Vector3(  1, -1,  1)),
                       Vertex(Vector3( -1, -1,  1)),
                       Vertex(Vector3( -1, -1, -1)),
                       Vertex(Vector3(  1,  1, -1)),
                       Vertex(Vector3(  1,  1,  1)),
                       Vertex(Vector3( -1,  1,  1)),
                       Vertex(Vector3( -1,  1, -1)) };

unsigned int indicies[] = { 2, 4, 1,
                            8, 6, 5,
                            5, 2, 1,
                            6, 3, 2,
                            3, 8, 4,
                            1, 8, 5,
                            2, 3, 4,
                            8, 7, 6,
                            5, 6, 2,
                            6, 7, 3,
                            3, 7, 8,
                            1, 4, 8 };

GLuint vbo;
GLuint ibo;
glGenBuffers(1, &vbo);
glGenBuffers(1, &ibo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(verticies[0]) * 8, verticies, GL_STATIC_DRAW);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indicies[0]) * 36, indicies, GL_STATIC_DRAW);

Then I create a program, compile shaders and link them to the program:

GLuint program = glCreateProgram();
GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
const GLchar* vertexCode = "vertexShader code";
const GLchar* fragmentCode = "fragmentShader code";
glShaderSource(vertexShader, 1, &vertexCode, NULL);
glCompileShader(vertexShader);
glShaderSource(fragmentShader, 1, &fragmentCode, NULL);
glCompileShader(fragmentShader);
glAttachShader(program, vertexShader);
glAttachShader(program, fragmentShader);
glLinkProgram(program);
glValidateProgram(program);

Now we're at the game loop:

while (1)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glUseProgram(program);
    glEnableVertexAttribArray(0);

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glDisableVertexAttribArray(0);
}
Toby Speight

2 Answers


Your indices are off by one: they are 0-based in OpenGL, so only the values 0 to 7 are valid for your 8-element array, and using 8 accesses out-of-bounds memory with undefined contents.

derhass
  • that is incorrect. glBufferData wants the size in bytes. so 3 * 8. – Sean O'Hanlon Aug 23 '16 at 16:22
  • 1
    @SeanO'Hanlon That's not what they're saying. They're saying the same thing as in my answer, but worded differently. – Xirema Aug 23 '16 at 16:23
  • 1
    @SeanO'Hanlon: I (and @Xirema) never talked about `glBufferData`. We talked about the actual values in your `indicies` array. – derhass Aug 23 '16 at 16:27
  • @SeanO'Hanlon The numbers in your `indicies` array must correspond to elements in the `verticies` array. Elements are 0-based. – isanae Aug 23 '16 at 16:52
  • @derhass i am really confused by what you guys mean. – Sean O'Hanlon Aug 23 '16 at 16:52
  • @SeanO'Hanlon Indexes in OpenGL are exactly that: Indexes. If you want to access the eighth element of your Vertex Array, you need to access it with Index 7, the same as you would a normal array. In your index array, all your accesses seem to be presuming that the first vertex is index 1, the second vertex is index 2, and so on. That is incorrect. If you take each number in `indicies` and subtract 1 from all of them, it should fix your problem. – Xirema Aug 23 '16 at 16:52
  • @SeanO'Hanlon In other words, your `indicies` array should be defined as `unsigned int indicies[] = { 1, 3, 0, 7, 5, 4, 4, 1, 0, 5, 2, 1, 2, 7, 3, 0, 7, 4, 1, 2, 3, 7, 6, 5, 4, 5, 1, 5, 6, 2, 2, 6, 7, 0, 3, 7};` – Xirema Aug 23 '16 at 16:55
  • that is really weird. I learnt that OpenGL takes them from 1 to size, and the OBJ I am working with (exported from Blender) also starts at 1. But this fixed it, thanks. – Sean O'Hanlon Aug 23 '16 at 17:02
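The last comment above explains the confusion: the Wavefront OBJ format does number its face indices from 1, while OpenGL numbers from 0, so an OBJ loader has to subtract one from every face index before handing it to `glDrawElements`. A minimal sketch of that conversion (the function name and use of `std::vector` are illustrative, not from the question's code):

```cpp
#include <vector>

// Convert 1-based OBJ face indices to the 0-based indices OpenGL expects.
std::vector<unsigned int> objToGlIndices(const std::vector<unsigned int>& objIndices)
{
    std::vector<unsigned int> glIndices;
    glIndices.reserve(objIndices.size());
    for (unsigned int i : objIndices)
        glIndices.push_back(i - 1); // OBJ counts from 1, OpenGL from 0
    return glIndices;
}
```

Doing this once at load time keeps the rendering code free of per-frame index arithmetic.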

Your index array doesn't look correct. It should be indexed in the range [0...7] instead of [1...8].
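Concretely, here is the shifted array, assuming the vertex order from the question (this matches the 0-based version Xirema posted in the comments):

```cpp
// Same triangles and winding as the original array, with every index
// reduced by one so it addresses verticies[0]..verticies[7].
unsigned int indicies[] = { 1, 3, 0,
                            7, 5, 4,
                            4, 1, 0,
                            5, 2, 1,
                            2, 7, 3,
                            0, 7, 4,
                            1, 2, 3,
                            7, 6, 5,
                            4, 5, 1,
                            5, 6, 2,
                            2, 6, 7,
                            0, 3, 7 };
```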

Xirema