I have a .NET/DirectX-based rendering library. At program start, we try to determine the amount of physical video RAM, so we know how much memory is available for textures (if you've done serious DirectX work, you probably know the drill).
Video RAM is detected by executing the following WMI code:
var searcher = new ManagementObjectSearcher("Select * from Win32_VideoController");
foreach (ManagementObject videoCard in searcher.Get())
{
    _numVideoCards++;
    foreach (PropertyData propertyData in videoCard.Properties)
    {
        // AdapterRAM is reported by the driver as a UInt32, in bytes.
        if (propertyData.Name == "AdapterRAM" && propertyData.Value != null)
        {
            // Keep the largest value seen across all adapters.
            _adapterRAM = Math.Max((UInt32)propertyData.Value, _adapterRAM);
        }
    }
}
This code was written years ago by people who knew DirectX better than I do.
The problem is that this call now fails unpredictably on customer hardware: _adapterRAM is still 0 after the code completes, and an exception is logged.
I would like to change this code, but I hesitate, because I suspect there is a reason why the video memory is detected this way rather than directly through DirectX.
My question is twofold:
- Does anyone know why you would detect video RAM through WMI rather than through DirectX, and/or
- do you know a more reliable way to detect the amount of physical video RAM?
P.S.: We're not interested in shared-memory video cards (e.g. Intel integrated graphics). We use SlimDX, if that makes a difference.
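For context, the DXGI route I'm considering would look roughly like the sketch below. This is a minimal, untested sketch that assumes SlimDX's DXGI wrapper (SlimDX.DXGI.Factory, Adapter, and AdapterDescription.DedicatedVideoMemory, mirroring the native DXGI_ADAPTER_DESC structure); exact property names and types may differ between SlimDX versions, and DXGI itself is only available on Windows Vista and later.

using System;
using SlimDX.DXGI;

static class VideoMemoryProbe
{
    // Returns the largest dedicated (non-shared) video memory, in bytes,
    // reported by any adapter, or 0 if no adapter reports dedicated memory.
    public static long GetMaxDedicatedVideoMemory()
    {
        long best = 0;
        using (var factory = new Factory())
        {
            for (int i = 0; i < factory.GetAdapterCount(); i++)
            {
                using (Adapter adapter = factory.GetAdapter(i))
                {
                    // DedicatedVideoMemory corresponds to
                    // DXGI_ADAPTER_DESC.DedicatedVideoMemory: memory not
                    // shared with the CPU, so shared-memory parts report
                    // little or none here.
                    best = Math.Max(best, adapter.Description.DedicatedVideoMemory);
                }
            }
        }
        return best;
    }
}

Whether that value is more trustworthy than WMI's AdapterRAM on the hardware in question is exactly what I'm unsure about.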