10

There are 3 backgrounds in the image below: black, white, and grey.

There are 3 bars on each one: black -> transparent, white -> transparent, and colors -> transparent.

I am using glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); and all my vertex colors are 1,1,1,0.

The defect is really visible in the white -> transparent bar on the white background.

On Windows XP (and other Windows flavors), it works perfectly, and I get fully white. On the Mac, however, I get grey in the middle!

What would cause this, and why would it get darker when I'm blending white on white?

Screenshot full size is @ http://dl.dropbox.com/u/9410632/mac-colorbad.png

Updated info:

On Windows, the OpenGL version doesn't seem to matter; 2.0 through 3.2 all work. The Mac I have in front of me now is on 2.1.

The gradients are held in textures, and all the vertices are colored 1,1,1,1 (white RGB, full alpha). The backgrounds are just 1x1 pixel textures (atlased with the gradients), and their vertices are colored as needed, with full alpha.

The atlas is uploaded with glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, data). The data comes from an ARGB DDS file that I composed myself.
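
For reference, the loading path looks roughly like this (a C sketch, not the actual .NET code; it assumes an uncompressed BGRA DDS, whose 4-byte magic plus 124-byte header account for the 128-byte skip, and atlas_path, w, h are illustrative names):

#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>

/* Sketch of the atlas upload: skip the DDS header, upload the raw BGRA bytes. */
GLuint load_atlas(const char *atlas_path, int w, int h)
{
    FILE *f = fopen(atlas_path, "rb");
    if (!f) return 0;

    fseek(f, 128, SEEK_SET);             /* 4-byte "DDS " magic + 124-byte header */
    size_t size = (size_t)w * h * 4;     /* uncompressed 32-bit pixels */
    unsigned char *data = malloc(size);
    fread(data, 1, size, f);
    fclose(f);

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Same upload call as above: BGRA source bytes, RGBA internal format. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, data);

    free(data);
    return tex;
}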

I should also note that everything is drawn using a trivially simple shader:

uniform sampler2D tex1;
uniform float alpha;

void main() {
    gl_FragColor = gl_Color * texture2D(tex1, gl_TexCoord[0].st) * vec4(1.0, 1.0, 1.0, alpha);
}

The alpha uniform is set to 1.0.
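
For completeness, the uniforms are driven roughly like this (a sketch; program is a stand-in for the real shader handle, and tex1 is assumed to be bound to texture unit 0):

/* Sketch: bind the sampler to unit 0 and force alpha to 1.0. */
glUseProgram(program);
glUniform1i(glGetUniformLocation(program, "tex1"), 0);
glUniform1f(glGetUniformLocation(program, "alpha"), 1.0f);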

Now, I did try changing the white gradient so it was not a texture but just 4 vertices, with the left ones solid white and opaque and the right ones 1,1,1,0, and that worked!

I have triple-checked the texture now, and it is only white, with alpha varying from 1.0 to 0.0.

I'm thinking this may be a defaults issue: the OpenGL version or the driver may initialize things differently.

For example, I recently found that nearly every driver has GL_TEXTURE_2D enabled by default, but the Intel GME965 does not.
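
One way to guard against that kind of driver-default drift is to set every piece of state you rely on explicitly at startup instead of trusting the defaults. A sketch of the states relevant here (legacy GL 2.x style, not necessarily what the program currently does):

/* Explicit state setup instead of relying on driver defaults. */
glEnable(GL_TEXTURE_2D);                  /* not enabled by default on some drivers */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);    /* tightly packed rows in the DDS data */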

SOLUTION FOUND

First, a bit more background. This program is actually written in .NET (using Mono on OS X), and the DDS file I'm writing is an atlas automatically generated by packing a directory of 24-bit PNG files into the smallest texture it can. I am loading those PNGs using System.Drawing.Bitmap and rendering them into a larger Bitmap after determining the layout. That post-layout Bitmap is then locked (to get at its bytes), and those bytes are written out to a DDS by code I wrote.

Upon reading Bahbar's advice, I checked out the textures in memory, and they were indeed different! My DDS loader seems to be the culprit, not any OpenGL settings. On a hunch today, I compared the DDS file itself on the two platforms (byte for byte), and indeed, they were different! When I load the DDS files using WTV ( http://developer.nvidia.com/object/windows_texture_viewer.html ), they look identical. However, WTV lets you turn off each channel (R, G, B, A). When I toggled off the alpha channel on the Windows DDS, I saw a really bad-looking image; no alpha means no antialiased edges, so of course it looked horrible. When I turned off the alpha channel on the OS X DDS, it looked fine!
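
The in-memory comparison can be done by reading the uploaded texture back with glGetTexImage (as suggested in the answer comments below) and dumping it to a file on each platform. A sketch, with tex, w, h standing in for the real handle and atlas size:

/* Read the uploaded texture back and dump it, so the bytes can be
   diffed across platforms. */
unsigned char *pixels = malloc((size_t)w * h * 4);
glBindTexture(GL_TEXTURE_2D, tex);
glGetTexImage(GL_TEXTURE_2D, 0, GL_BGRA, GL_UNSIGNED_BYTE, pixels);

FILE *dump = fopen("texture-dump.raw", "wb");
fwrite(pixels, 1, (size_t)w * h * 4, dump);
fclose(dump);
free(pixels);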

The PNG loader in Mono is premultiplying the alpha, which was causing all my issues. I filed a ticket with them ( https://bugzilla.novell.com/show_bug.cgi?id=679242 ) and have switched to using libpng directly.
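
For anyone else hitting this, the arithmetic explains why premultiplied data goes grey with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) even though the source is pure white (just an illustration, not code from the app):

/* Blending white over a white background, dst.rgb = 1.0:
 *   straight alpha:      src.rgb = 1.0    -> out = 1.0*a + 1.0*(1-a) = 1.0  (stays white)
 *   premultiplied alpha: src.rgb = 1.0*a  -> out = a*a  + 1.0*(1-a) = 1 - a + a*a
 * The second curve dips to 0.75 at a = 0.5, which is exactly the grey band
 * in the middle of the gradient. */
float blend_straight(float a)      { return 1.0f * a + 1.0f * (1.0f - a); }
float blend_premultiplied(float a) { return a * a + 1.0f * (1.0f - a); }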

Thanks everyone!

Danny Dulai
  • 1,657
  • 12
  • 14
  • What is the rendering surface format, especially regarding sRGB? The white on white has me stumped, though. – Bahbar Feb 11 '11 at 08:24
  • 1
    Are the bars drawn using textures, or using color interpolation? – rotoglup Feb 11 '11 at 08:36
  • Which vertices are 1,1,1,0? Even the black, white, and grey backgrounds? – Shezan Baig Feb 11 '11 at 13:49
  • Always include the OpenGL version; it could be as simple as different OpenGL versions running. Use something like this to query it: `std::cout << glGetString(GL_VERSION) << std::endl;`. As OpenGL is part of the drivers, I doubt the same version is running on both, and many things could change. – Raven Feb 11 '11 at 17:04
  • updated question with more info, including a success when using color interpolation – Danny Dulai Feb 11 '11 at 19:39
  • Congratulations on finding the problem! – John Bartholomew Mar 13 '11 at 16:07

2 Answers

2

This is a bit of a stab in the dark, but check (or explicitly set) the pixel transfer modes.

The output you're getting looks like the results you'd expect if you were using textures with pre-multiplied alpha but then using the blend mode you've set. Something might have set up the pixel transfer modes to multiply alpha into the colour channels when textures are uploaded.

You could also check if the Mac result is correct (with textures) when you set the blend mode to glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA).
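
Something along these lines will show whether the transfer scale/bias are at their defaults, and reset them if not (a sketch):

/* Pixel transfer scale/bias apply during glTexImage2D; defaults are 1 and 0. */
GLfloat red_scale, red_bias;
glGetFloatv(GL_RED_SCALE, &red_scale);
glGetFloatv(GL_RED_BIAS, &red_bias);
printf("red scale=%f bias=%f\n", red_scale, red_bias);   /* repeat for GREEN/BLUE/ALPHA */

glPixelTransferf(GL_RED_SCALE, 1.0f);   glPixelTransferf(GL_RED_BIAS, 0.0f);
glPixelTransferf(GL_GREEN_SCALE, 1.0f); glPixelTransferf(GL_GREEN_BIAS, 0.0f);
glPixelTransferf(GL_BLUE_SCALE, 1.0f);  glPixelTransferf(GL_BLUE_BIAS, 0.0f);
glPixelTransferf(GL_ALPHA_SCALE, 1.0f); glPixelTransferf(GL_ALPHA_BIAS, 0.0f);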

John Bartholomew
  • 6,428
  • 1
  • 30
  • 39
  • It's an uncompressed DDS, and I'm loading it using `glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, data);`. I'm not premultiplying it; I just read the DDS file from disk and pass the non-header part to the above call. Which OpenGL calls could affect whether the upload premultiplies? – Danny Dulai Feb 16 '11 at 20:04
  • I printed out the values of GL_*_SCALE and GL_*_BIAS and they are the same on Windows and Mac (1 for the scales, 0 for the biases), so that's not it. When I use GL_ONE instead of GL_SRC_ALPHA in the blend mode, I see the white correctly, but then all my darkening overlays break. – Danny Dulai Feb 16 '11 at 20:58
  • There is (or may be, depending on the OpenGL version and extensions) a color matrix applied during pixel transfer. Try clearing it with `glMatrixMode(GL_COLOR); glLoadIdentity();` – John Bartholomew Feb 16 '11 at 23:39
  • Argh, that did not help :-( Is there a way to confirm that the contents of the texture after glTexImage2D are actually wrong? Can I read the pixels back and dump them somehow? (I've never read from a texture before.) That would let me confirm this is indeed the issue. – Danny Dulai Feb 17 '11 at 04:16
  • @Danny Dulai: you can read the data back with glGetTexImage. http://www.manpagez.com/man/3/glGetTexImage/ – Bahbar Feb 17 '11 at 09:59
1

Check your DDS loader. It might be doing the pre-multiplication that John Bartholomew was talking about at load time, but only on one platform.

An easy way to verify is to look at the data as it is being loaded into the texture, at the glTexImage call. Is that data completely uniform?
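
For the white gradient, a check like this right before the glTexImage call will tell you whether the color channels are still uniformly white (a sketch; data, w, h are the same BGRA buffer and size passed to glTexImage2D, restricted to the white gradient's region if it shares an atlas):

/* For a white gradient, every B, G, R byte should be 255 no matter what
   the alpha is. If they track the alpha, something premultiplied the data. */
int uniform_white = 1;
for (size_t i = 0; i < (size_t)w * h * 4; i += 4) {
    if (data[i] != 255 || data[i + 1] != 255 || data[i + 2] != 255) {
        uniform_white = 0;
        break;
    }
}
printf("white gradient uniform: %s\n", uniform_white ? "yes" : "no");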

Bahbar
  • 17,760
  • 43
  • 62