
I know how to generate a plane and map a texture onto it. Now I am trying to display an alpha-blended PNG on my form, as if it were a sprite.

I have come up with the following code from Googling and guessing:

//  Get the OpenGL object.
var gl = openGLControl.OpenGL;

//  We need to load the texture from file.
var textureImage = Resources.Resource1.bg;

//  A bit of extra initialisation here, we have to enable textures.
gl.Enable(OpenGL.GL_TEXTURE_2D);

//  Get one texture id, and stick it into the textures array.
var textures = new uint[1];
gl.GenTextures(1, textures);

//  Bind the texture.
gl.BindTexture(OpenGL.GL_TEXTURE_2D, textures[0]);

gl.Enable(OpenGL.GL_BLEND);
gl.BlendFunc(OpenGL.GL_SRC_ALPHA, OpenGL.GL_DST_ALPHA);

var locked = textureImage.LockBits(
    new Rectangle(0, 0, textureImage.Width, textureImage.Height),
    System.Drawing.Imaging.ImageLockMode.ReadOnly,
    System.Drawing.Imaging.PixelFormat.Format32bppArgb
);

gl.TexImage2D(
    OpenGL.GL_TEXTURE_2D,
    0,
    4,
    textureImage.Width,
    textureImage.Height,
    0,
    OpenGL.GL_RGBA,
    OpenGL.GL_UNSIGNED_BYTE,
    locked.Scan0
);
gl.TexParameter(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_WRAP_S,     OpenGL.GL_CLAMP);
gl.TexParameter(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_WRAP_T,     OpenGL.GL_CLAMP);
gl.TexParameter(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_MAG_FILTER, OpenGL.GL_LINEAR);
gl.TexParameter(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_MIN_FILTER, OpenGL.GL_LINEAR);

Here is the original source image that I am using:

Here is the output when I render this sprite 10 × 10 (100) times on my form:

The output is all messed up, and it does not seem to be respecting the image's alpha channel. What changes need to be made to my code so that it renders correctly?

Cody Gray - on strike
CyberFox

1 Answer


Your question is very vague, but I think what you're looking for is changing the blend function to `glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);`

This enables both semi- and fully transparent texture rendering.

Please note that this will not work correctly together with depth buffering unless transparent geometry is drawn back to front.
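In SharpGL terms (which the question's code appears to use), the corrected setup would look something like this minimal sketch:

```csharp
//  Enable blending with standard "over" compositing:
//  result = src.rgb * src.a + dst.rgb * (1 - src.a)
gl.Enable(OpenGL.GL_BLEND);
gl.BlendFunc(OpenGL.GL_SRC_ALPHA, OpenGL.GL_ONE_MINUS_SRC_ALPHA);
```

For depth-buffered scenes, a common workaround is to draw all opaque geometry first, then draw the transparent quads sorted back to front.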

SporreKing
  • I'm truly sorry for coming off as vague; I'm new to the world of OGL and my terminology is not quite fluent. Anyway, I'm porting a 2D game from GDI+ to OpenGL. When I say sprite I simply mean nothing more than a picture on screen. I understand that is achieved by drawing a quad with a texture and UV map. This works but looks... strange. The alpha is right but the color has gone apeshit; perhaps some channels are being ignored, or what do I know. I'll upload a picture tomorrow when I'm back in the office. – CyberFox Mar 08 '16 at 15:58
  • OpenGL expects the format to be RGBA. Make sure the PNG data is specified in 32-bit color and that the PNG decoder does not return the data in any other format such as ARGB (I know that Java's ImageIO does this). I'll gladly look at the pictures when you have time to post them. – SporreKing Mar 08 '16 at 16:09
  • Also, OGL expects the texture data to be specified in little-endian order, if I remember correctly. – SporreKing Mar 08 '16 at 16:40
  • I have added an image now. I'd mention that I have tried both GL_RGBA and GL_BGRA, and I have even processed the image to move the bytes around, turning ARGB -> RGBA, to no avail. I did find that after zeroing the "i+1" and "i+2" channels on the image, the output remains the same. Could I add your Skype by any chance? – CyberFox Mar 09 '16 at 03:14
  • According to the pictures, the problem most certainly lies in a wrong byte order (either endianness or color format). Make sure that the byte order you use for OGL is specified in little endian: RGBA in big endian would have to be sent as ABGR in little endian, etc. If that does not work, you can add me on Skype, same name as here. Note that I'm only familiar with OGL in general, not specifically GDI+. – SporreKing Mar 09 '16 at 07:50
  • I tried using an unsafe context to shuffle the bits around, but the results were fishy. I've added you on Skype for further discussion. – CyberFox Mar 09 '16 at 08:46
  • I will be home later this evening; I will get on Skype then. By the way, I noticed that you have not passed entirely correct arguments to `void glTexImage2D( GLenum target, GLint level, GLint internalFormat, GLsizei width, GLsizei height, GLint border, GLenum format, GLenum type, const GLvoid * data);` The internalFormat parameter expects a format such as GL_RGBA; in your code you have used the number 4. Use a format constant. – SporreKing Mar 09 '16 at 09:24
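Pulling the comment thread together, here is a hedged sketch of what the corrected upload in the question's SharpGL code might look like. It assumes your SharpGL build exposes an `OpenGL.GL_BGRA` constant (value 0x80E1): GDI+'s `Format32bppArgb` is laid out byte-wise as BGRA on little-endian machines, so passing `GL_BGRA` as the pixel format avoids the manual byte shuffling discussed above, and the internal format becomes a proper constant instead of the number 4:

```csharp
var locked = textureImage.LockBits(
    new Rectangle(0, 0, textureImage.Width, textureImage.Height),
    System.Drawing.Imaging.ImageLockMode.ReadOnly,
    System.Drawing.Imaging.PixelFormat.Format32bppArgb);
try
{
    gl.TexImage2D(
        OpenGL.GL_TEXTURE_2D,
        0,
        OpenGL.GL_RGBA,          // internal format: a constant, not the number 4
        textureImage.Width,
        textureImage.Height,
        0,
        OpenGL.GL_BGRA,          // matches GDI+'s in-memory BGRA byte order
        OpenGL.GL_UNSIGNED_BYTE,
        locked.Scan0);
}
finally
{
    //  The original code never unlocks the bitmap.
    textureImage.UnlockBits(locked);
}
```

Together with the `GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA` blend function from the answer, this should render the PNG with its alpha channel respected.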