
Short story:

When I render anything using a texture loaded like this:

glTexImage2D   ( GL_TEXTURE_2D, 0, GL_R8, width,  height, 0, GL_RED,   GL_UNSIGNED_BYTE, pixels );

I get only black.


Long story:

I can render an RGBA texture with an alpha channel (e.g. text with a transparent background). This code works:

// === load
#define GL_ABGR 0x8000
SDL_Surface * surf = SDL_LoadBMP( "common_resources/dejvu_sans_mono_RGBA.bmp" );
glGenTextures  ( 1, &itex );
glBindTexture  ( GL_TEXTURE_2D, itex );
glTexImage2D   ( GL_TEXTURE_2D, 0, GL_RGBA, surf->w,  surf->h, 0, GL_ABGR,   GL_UNSIGNED_BYTE, surf->pixels );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
// ....
// === render
glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, itex );
glColor3f(1.0f,1.0f,1.0f);
glEnable(GL_BLEND);
glEnable(GL_ALPHA_TEST);
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
drawString ( caption,             xmin, ymin+12, 6 );

renders like this: [screenshot of the correctly rendered text]

But I'm trying to use one-channel (8-bit, grayscale) images / textures instead of RGBA. These I cannot get to render, with or without transparency. Whatever I do, I get only a black image.

This code doesn't work:

// === load
#define GL_ABGR 0x8000
SDL_Surface * surf = SDL_LoadBMP( "common_resources/dejvu_sans_mono_Alpha.bmp" );
glGenTextures  ( 1, &itex );
glBindTexture  ( GL_TEXTURE_2D, itex );
glTexImage2D   ( GL_TEXTURE_2D, 0, GL_R8, surf->w,  surf->h, 0, GL_RED,   GL_UNSIGNED_BYTE, surf->pixels );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
// ....
// === render
glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, itex );
glColor3f(1.0f,1.0f,1.0f);
//glEnable(GL_BLEND);
//glEnable(GL_ALPHA_TEST);
//glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
drawString ( caption,             xmin, ymin+12, 6 );

renders like this: [screenshot of the all-black result]

Notes:

  1. I know that I should somehow use glTexEnv (according to e.g. here), but my main problem is that apparently the monochrome texture does not render at all; see the sketch after this list
  2. I also tried GL_LUMINANCE and GL_INTENSITY instead of GL_RED in glTexImage2D, with no difference
  3. there are other questions like here and here, but mostly about OpenGL > 3.0 and fragment shaders
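
A minimal sketch of the texture-environment setup from note 1, assuming a fixed-function (compatibility) context: GL_MODULATE multiplies the sampled texel with the current glColor. Keep in mind that a texture with base format GL_RED samples as (r, 0, 0, 1), while GL_LUMINANCE samples as (l, l, l, 1), so only the latter replicates the single channel into all color channels without shaders:

// fixed-function texture environment:
// GL_MODULATE multiplies the sampled texel with the current glColor
glBindTexture( GL_TEXTURE_2D, itex );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );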

Also, is it possible that my graphics card or driver does not support this? I'm on Ubuntu 16.04:

GL_VENDOR:     Intel Open Source Technology Center
GL_RENDERER:   Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2) 
GL_VERSION:    3.0 Mesa 11.2.0
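
One way to answer the driver question directly is to ask GL itself. A minimal diagnostic sketch, assuming it runs right after the failing glTexImage2D call (and that stdio.h is included): if the driver rejected the format, glGetError reports it, and glGetTexLevelParameteriv shows which internal format was actually allocated.

GLenum err = glGetError();  // error flag set by the upload, if any
if( err != GL_NO_ERROR ) printf( "glTexImage2D failed: 0x%X\n", err );

GLint fmt = 0, rbits = 0;   // what the driver actually stored for level 0
glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt   );
glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,        &rbits );
printf( "internal format: 0x%X, red bits: %d\n", fmt, rbits );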

For completeness, although it is not important here, drawString looks like this:

void drawString( const char* str, float x, float y, float sz ){
  const int nchars = 95;
  float persprite = 1.0f/nchars;
  glBegin(GL_QUADS);
  for(int i=0; i<65536; i++){
    if( str[i] == 0 ) break; // 0-terminated string
    int isprite = str[i] - 33; // 33 is the offset of the meaningful ASCII characters
    float offset  = isprite*persprite+(persprite*0.57);
    float xi = i*sz + x;
    glTexCoord2f( offset          , 1.0f ); glVertex3f( xi,    y,      3.0f );
    glTexCoord2f( offset+persprite, 1.0f ); glVertex3f( xi+sz, y,      3.0f );
    glTexCoord2f( offset+persprite, 0.0f ); glVertex3f( xi+sz, y+sz*2, 3.0f );
    glTexCoord2f( offset          , 0.0f ); glVertex3f( xi,    y+sz*2, 3.0f );
  }
  glEnd();
}

asked by Prokop Hapala

1 Answer


I want to try to help you. In my projects I use these arguments for generating textures from grayscale source images:

glTexImage2D(GL_TEXTURE_2D, 0, 1, width, height, 0, GL_RED,
             GL_UNSIGNED_BYTE, pixels);

As written in the documentation, the third argument is the number of color components (1 in our case). You need to check the integer value of GL_R8, or replace it explicitly.

GL_RED means that you place the luminance values in the red channel only (not in each of the red, green, and blue channels, as for a grayscale image).

answered by Dmitry Kurtaev
  • thanks, I tried `glTexImage2D( GL_TEXTURE_2D, 0, 1, surf->w, surf->h, 0, GL_RED, GL_UNSIGNED_BYTE, surf->pixels );` but the result is still the same ... maybe the Intel GPU really does not support it (?) – Prokop Hapala Oct 09 '16 at 13:52
  • @ProkopHapala, do you have the command ```glPixelStorei(GL_UNPACK_ALIGNMENT, 1);``` ? I usually call ```glBindTexture``` -> ```glPixelStorei``` -> 2x ```glTexParameteri``` -> ```glTexImage2D``` – Dmitry Kurtaev Oct 09 '16 at 13:56
  • You should never use bare numbers for the internal format of a texture. Always use sized image formats, even for greyscale images. – Nicol Bolas Oct 09 '16 at 14:10
  • now I tried adding `glPixelStorei(GL_UNPACK_ALIGNMENT, 1);` but it did not help – Prokop Hapala Oct 09 '16 at 14:14
  • @ProkopHapala, I think that the resulting texture in memory has zeros in the alpha channel (we are loading our intensities into red). – Dmitry Kurtaev Oct 09 '16 at 14:19
  • @ProkopHapala, Hello again! Please try to use ```glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);``` – Dmitry Kurtaev Oct 18 '16 at 15:52
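
Putting the two comment suggestions together, a sketch of the load path with the call order from the Oct 09 comment and the GL_LUMINANCE formats from the Oct 18 comment (untested on the asker's hardware):

glBindTexture  ( GL_TEXTURE_2D, itex );
glPixelStorei  ( GL_UNPACK_ALIGNMENT, 1 );  // rows of an 8-bit image need not be 4-byte aligned
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexImage2D   ( GL_TEXTURE_2D, 0, GL_LUMINANCE, surf->w, surf->h, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, surf->pixels );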