
I am trying to load a 2D texture from a (currently) hardcoded byte array using OpenGL and display it in an ImGui window.

I was using DirectX before and loaded the texture with D3DXCreateTextureFromFileInMemoryEx, which worked completely fine. However, I am now working on a different application that requires OpenGL instead, and I seem to be making a mistake while loading the texture.
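
For reference, the D3DX call decodes the encoded image file data by itself; a simplified sketch of such a call, with pDevice standing in for the application's IDirect3DDevice9* and the remaining parameter choices purely illustrative, would be:

#include <d3dx9.h>

IDirect3DTexture9* d3dTexture = nullptr;
HRESULT hr = D3DXCreateTextureFromFileInMemoryEx(
    pDevice,                    // IDirect3DDevice9*
    Image, sizeof(Image),       // encoded file data and its size in bytes
    D3DX_DEFAULT, D3DX_DEFAULT, // width/height: taken from the file
    D3DX_DEFAULT,               // mip levels
    0,                          // usage
    D3DFMT_UNKNOWN,             // format: taken from the file
    D3DPOOL_MANAGED,            // memory pool
    D3DX_DEFAULT,               // filter
    D3DX_DEFAULT,               // mip filter
    0,                          // no color key
    nullptr,                    // image info (not needed)
    nullptr,                    // palette (not needed)
    &d3dTexture);               // resulting texture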

I have already looked at a few other threads with a similar issue, but without success. My code always draws just a black rectangle, and I honestly have no clue what I'm doing wrong.

I currently have a hardcoded BYTE[], the same one I used in my previous application, where it displayed just fine.

BYTE Image[/*616*/] =
{
    0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A, 0x00, 0x00,
    0x00, 0x0D, 0x49, 0x48, 0x44, 0x52, 0x00, 0x00, 0x00, 0x14,
    0x00, 0x00, 0x00, 0x14, 0x08, 0x06, 0x00, 0x00, 0x00, 0x8D,
    0x89, 0x1D, 0x0D, 0x00, 0x00, 0x02, 0x2F, 0x49, 0x44, 0x41,
    0x54, 0x38, 0x8D, 0x8D, 0x95, 0x4B, 0x48, 0x55, 0x51, 0x14,
    0x86, 0xBF, 0x6B, 0x66, 0x37, 0x35, 0x7B, 0x3F, 0x84, 0x9A,
    ...
};

This is the code I use to load the texture:

void sglLoadTexture(BYTE* texture, UINT width, UINT height, GLuint* out_texture)
{   
    char* image_data = new char[sizeof(texture)];  // copy-paste from some forum thread 

    // Create an OpenGL texture identifier
    GLuint image_texture;
    glGenTextures(1, &image_texture);
    glBindTexture(GL_TEXTURE_2D, image_texture);

    // Setup filtering parameters for display
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP); 
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP); 
    
    // Upload pixels into texture
#if defined(GL_UNPACK_ROW_LENGTH) && !defined(__EMSCRIPTEN__)
    glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
#endif
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_INT, image_data);
    *out_texture = image_texture;
}
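
A quick sanity check (not in the original code, just a debugging aid) would be to query glGetError() right after the glTexImage2D call to see whether the upload itself is accepted:

GLenum err = glGetError();  // GL_NO_ERROR (0) means the call was accepted
if (err != GL_NO_ERROR)
    printf("glTexImage2D failed with error 0x%X\n", err);  // requires <cstdio>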

The calling context:

// declaration of the array
GLuint* icons[8] = { nullptr,nullptr,nullptr,nullptr,nullptr,nullptr,nullptr,nullptr };

GLuint texture;
sglLoadTexture(Image /*The byte array from before*/, 20, 20, &texture);
icons[GLTEXTURE_IMAGE /*just a variable that holds the value 0*/] = &texture;

And this is the code where I try to display the loaded texture using ImGui:

ImGui::Image(icons[GLTEXTURE_IMAGE], ImVec2(16, 16));

However, this code draws a black 16x16 rectangle, and I don't know where I went wrong. I'm not very experienced with OpenGL, so it might just be a really dumb error in my code.

  • Pass `texture` directly to ImGui, not a pointer to it. If ImGui insists on a pointer, cast it to one instead of taking the address (see the sketch below these comments). – HolyBlackCat Jun 04 '22 at 09:33
  • Thanks, at least I get a texture now instead of a black rectangle. However, the outcome is really weird and doesn't look like what I expected at all. It should look like a white star, but it is out of shape and has strange colors. It's a little small, but here's a [Screenshot](https://prnt.sc/Jvsmq0eVD7sT); as you can probably tell, it doesn't really look like what I described. – SteveOberst Jun 04 '22 at 11:23
  • `glTexImage2D` should probably use `GL_UNSIGNED_BYTE`. – HolyBlackCat Jun 04 '22 at 11:26
  • Thanks, I have changed it to `GL_UNSIGNED_BYTE`; the outcome is different but still glitchy. I have also tried playing around with the `format` and `internalFormat` parameters, changing them to `GL_RGB` instead of `GL_RGBA` and other possible values, but the [result](https://prnt.sc/VdIiV8GXXMuV) became even stranger... – SteveOberst Jun 04 '22 at 13:11
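
A minimal sketch of what the comments suggest, assuming the stock imgui_impl_opengl3 backend (where an ImTextureID simply carries the GL texture name), would look like this:

// upload the pixel data as bytes rather than unsigned ints
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, image_data);

// hand ImGui the texture name itself, cast to ImTextureID, instead of a GLuint*
GLuint texture;
sglLoadTexture(Image, 20, 20, &texture);
ImGui::Image((ImTextureID)(intptr_t)texture, ImVec2(16, 16));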
