
This might sound like a weird question, but here's the story:

There is a small but stubborn community that still plays an older 2005-2006 game (Age of Empires III + its expansions). Some play it on Windows and others on OS X (the game was also ported to OS X).

The problem: when the game was first released for OS X, everyone could play with medium and high shader settings, so it seems the graphics cards were properly recognised at the time. However, once newer Mac models switched to Intel graphics chips (Intel HD 4000, Iris, etc.), the medium and high shader settings stopped working.

The same game can still display medium and high shaders properly on Windows, but fails to do so in OS X on the same machine (dual boot), so it's not a hardware problem: the same graphics chip can display high shaders on Windows (in a DirectX environment) but fails to do so in OS X (in an OpenGL environment).

I first thought this was because the company that ported the game (Macsoft) did not adapt the shader programming for OpenGL, but apparently they did. And since high shaders once worked in OS X, their OpenGL adaptation clearly worked with earlier hardware. So it seems to be a problem with how the game's configuration files recognise hardware devices: the newer Intel graphics chips are somehow not recognised as DirectX 8/9, Shader Model 2.0 capable, and rendering falls back to basic DirectX 7 capabilities without any shaders.

I did a bit of research in the game files to track down which file makes the game recognise graphics chips, and it turns out to be devices.xml in this folder (I uploaded all the shader files to a Google Drive, in case anyone wants to take a look):

/Config/ folder

I discovered that the game ignores all the specific entries in devices.xml and falls through to the last, generic device with the lowest shader settings:

    <device name="GenericFF">
        generic dx7
        <low>Generic DX7</low>
    </device>

If I replace this device entry with a Generic DX8 setting, the game starts displaying more shaders, but the water rendering takes on weird colour ranges from green to purple.
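
For reference, my test edit simply mirrors the structure of the existing entry and swaps the profile name (this assumes the game also ships a matching Generic DX8.d3dconfig in /Config/, alongside the Generic DX7 one described below; nothing else is changed):

    <device name="GenericFF">
        generic dx8
        <low>Generic DX8</low>
    </device>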

This Generic DX7 profile is also defined in a separate file called Generic DX7.d3dconfig, located in the same /Config/ folder I mentioned before. It contains a bunch of settings such as:

<d3dconfig>
    <fixedfunctionverts>true</fixedfunctionverts>
    <techniques>
        <enable>alpha transparency</enable>
        <enable>alphatest</enable>
        <enable>alphatest_flatcolor</enable>
        <enable>alphatest_blendcolor</enable>

Each of these shader capabilities has its own settings file; for example, alphatest (in the /techniques/ folder):

<technique>
    <pass>
        <texture stage="0">diffuse</texture>
        <rs>alphatest</rs>
        <tss>default</tss>
    </pass>
</technique>

Finally, all these configuration files, which use an XML-like format, point to shader programming files such as default_shader.psh in the /Render/ps/ folder:

ps.1.1

def c7, 1, 0, 0, 0


tex t0   //base texture
#ifdef ENVMAP
tex t2   //envmap texture
#endif

mul_x2 r0.rgb, t0, v0       // texture * light
+mul r0.a, t0, v0       // texture * light

#ifdef ENVMAP
// reflectivity is a float value in the r component so the dp3 is a
// clunky way to propogate this value into all components before the lerp
dp3 r1, c6, c7
lrp r0, r1, t2, r0
#endif

#ifdef TREE
add r0, r0, c1
#endif

#ifdef TINT
add r0, r0, c0
#endif

The question is: is it possible to make this old game (whose shader programming supports up to Shader Model 2.0) recognise newer Intel graphics chips, or is this a problem with how the OS X port was adapted to an OpenGL environment?

UPDATE:

There is another folder that seems unused by the game, and in it a file with the same name as the one I mentioned above contains code such as:

// default_shader.psh

#ifdef TREE
uniform vec4 pc1;
#endif
#ifdef TINT
uniform vec4 pc0;
#endif

uniform sampler2D tx0;

void main()
{
    vec4 diffuse = vec4( texture2D(tx0, gl_TexCoord[0].st) ) * gl_Color;
    diffuse.xyz *= 2.0;

#ifdef TREE
    diffuse += pc1;
#endif

#ifdef TINT
    diffuse += pc0;
#endif

    gl_FragColor = diffuse;
} 

/*
ps.1.1

def c7, 1, 0, 0, 0


tex t0   //base texture
#ifdef ENVMAP
tex t2   //envmap texture
#endif

mul_x2 r0.rgb, t0, v0       // texture * light
+mul r0.a, t0, v0       // texture * light

#ifdef ENVMAP
// reflectivity is a float value in the r component so the dp3 is a
// clunky way to propogate this value into all components before the lerp
dp3 r1, c6, c7
lrp r0, r1, t2, r0
#endif

#ifdef TREE
add r0, r0, c1
#endif

#ifdef TINT
add r0, r0, c0
#endif
*/
  • That is not an OpenGL shader; you'll need to port that assembly to GLSL first – ratchet freak Dec 11 '14 at 14:21
  • Well, there is another folder which I didn't upload because it seems unused by the game, and there's a file in it with the same name (default_shader.psh) – I can't post its code here (too many characters), so I've posted it as an update above. I'm not a shader programmer, but that looks like their OpenGL code, right? My programming knowledge is some basic C++ and some scripting languages. – dolanator Dec 11 '14 at 14:28
  • That's a simpler fragment shader with #IFDEFs; you'd need to preprocess them and remove them plus the code in between, based on what the program wants – ratchet freak Dec 11 '14 at 14:34
  • The thing is the same code worked on older hardware, so for example OS X machines with nVidia and ATI cards (from about 2006-7) which were targeted by this code displayed high shaders with no problem. But newer Intel chips fail to get identified. Is this a problem with the shader programming, the OpenGL port they did or is it maybe just a simple setting, like writing a new device profile which points the game to identify new Intel hardware? – dolanator Dec 11 '14 at 14:51
  • @Mister are you sure that these are the shaders that got passed to OpenGL? Some ports used runtime converters from HLSL to GLSL, so maybe use something like apitrace to capture the exact strings passed to glShaderSource – PeterT Dec 11 '14 at 14:57
  • @ratchetfreak GLSL supports preprocessing directives. – Reto Koradi Dec 11 '14 at 15:05
  • @Mister Oh, and if it's the game using the wrong shaders, why not just do the obvious thing and add your device ID ([which you can get here](http://en.wikipedia.org/wiki/List_of_Intel_graphics_processing_units)) under `` just like the existing entries there `2582` and `2592` – PeterT Dec 11 '14 at 15:08
  • @PeterT I thought about that, but the game seems to ignore those entries entirely and default on a generic DirectX 7 or whatever OpenGL equivalent was ported over to OS X (I imagine it's not the same, but I'm not a shaders programmer so I don't know the differences). I'll try adding an ID and see if it works, though. – dolanator Dec 11 '14 at 15:11
  • @Mister: What version of OS X and GPU are you talking here? In 10.7, some of the older Intel GPUs (GMA xxx) do something peculiar and only support OpenGL 1.4 (though offer almost the entire 2.1 feature set in the form of extensions). 10.8 dropped support for those GPUs altogether, so if you're talking HD 3000+ class GPUs that's probably not the issue - however, there are still quite a few extensions that AMD and NV GPUs support on OS X that even the latest Intel GPUs don't. – Andon M. Coleman Dec 11 '14 at 15:17
  • @AndonM.Coleman, I tested on a 10.8.5 and on 10.10 with HD Graphics 4000. – dolanator Dec 11 '14 at 15:21
  • @PeterT I added the device ID under the Intel9 entry and the game ignores it. Thanks for the comments everyone. It might be interesting to understand why this happens, just for everyone's experience, but maybe the problem also gets solved, in the process. – dolanator Dec 11 '14 at 15:22
  • What is interesting is if I modify that generic entry which was set to 'generic DX7' and replace it with 'generic DX8', the game starts displaying water differently: it looks like a sea of plastic with colours ranging from green to purple. But if I replace this entry with 'generic DX9' most of the terrain fails to get drawn. It's like these rendering capabilities stack each on top of the other, starting from the most basic DX7 to highest (shader2). But the problem is I haven't been able to make the game recognise medium and high shader settings, all I could do is change the baseline. – dolanator Dec 11 '14 at 15:53

1 Answer


My conclusion is that this is most likely impossible, because a large part of the OpenGL port involved separate shader programming for each vendor's hardware.

For example, in the /shaders/ folder there are separate files for terrain shadow rendering for each major manufacturer that supplied graphics cards for OS X machines (ATI and nVidia): terrain shadow only_ati.vsh and terrain shadow only_nv.vsh. These use different methods to render the same thing:

// terrain shadow only_ati.vsh

uniform mat4 vc0;   // worldViewProj
uniform mat4 vc4;
uniform mat4 vc8;
uniform vec4 vc12;

void main()
{
    // transform position
    gl_Position = gl_Vertex * vc0;

    // Set to constant color
    gl_FrontColor = vec4(0.0);

    // transform
    gl_TexCoord[0] = gl_Vertex * vc4;

    // Depth in light space
    vec4 r1 = gl_Vertex * vc8;
    gl_TexCoord[1].s = r1.z * vc12.x;
}

and

// terrain shadow only_nv.vsh

uniform mat4 vc0;   // worldViewProj
uniform mat4 vc4;
uniform vec4 vc15;  // texture filtering offsets
uniform vec4 vc16;
uniform vec4 vc17;

void main()
{
    // transform position
    gl_Position = gl_Vertex * vc0;

    // Set to constant color
    gl_FrontColor = vec4(0.0, 0.0, 0.0, 0.0);

    // transform
    vec4 r0 = gl_Vertex * vc4;
    r0.y = 1.0 - r0.y;      // Invert y (reading from a GL buffer)

    //-- compute PCF Filter texture offsets
    gl_TexCoord[0] = r0 + vc15;
    gl_TexCoord[2] = r0 + vc16;
    gl_TexCoord[3] = r0 + vc17;
}

Since there was no such OpenGL port for Intel chips, there is probably no support for medium and high shader settings in the OpenGL version of this game.
