
Direct3D implementation on an i5-1135G7 2.4 GHz, 8 GB RAM, with Iris Xe graphics. OpenGL may have similar issues. I'm hoping someone has ideas on the most common issues to work around with the new Intel Iris Xe Graphics. Our app enumerates the adapters, devices, and outputs.

The first issue, sort of resolved, is that enumerating both HARDWARE and WARP capabilities seems to leave things in a bad state: the next time I try to create the real HARDWARE device, it returns 0 for the Output, as if I were requesting Outputs for a WARP type, so the SwapChain was not created. If I only enumerate the HARDWARE type capabilities, I get past this issue.

However, I then get another crash. With a valid Output and SwapChain, pSwapChain->Present returns OK but hangs; TDR kicks in and removes the device, yet there are no error or debug-layer strings saying the device was removed. I'm also noticing a bunch of odd first-chance C++ exceptions from monza::ddithreadingcontext, and they seem related to releasing SAFE_RELEASE(pAdapter), pd3dDevice, and pd3dDeviceContext in the code that enumerates all the adapter capabilities. Any additional info that may help me or others is appreciated. A simplified sketch of the enumeration pattern is below.
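For context, the enumeration is essentially the standard DXGI pattern, roughly like the following (a simplified sketch, not our exact app code; the real code also records capabilities per adapter):

```cpp
// Simplified sketch of the DXGI enumeration described above: list every
// adapter and its outputs, then create the real hardware device.
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

HRESULT EnumerateAndCreate()
{
    ComPtr<IDXGIFactory1> factory;
    HRESULT hr = CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);   // desc.Description is the adapter/driver name string

        UINT outputCount = 0;
        ComPtr<IDXGIOutput> output;
        while (adapter->EnumOutputs(outputCount, &output) != DXGI_ERROR_NOT_FOUND)
            ++outputCount;          // on the failing runs this stays 0 for the hardware adapter
    }

    // Real hardware device for rendering (adapter == nullptr, HARDWARE driver type).
    ComPtr<ID3D11Device> device;
    ComPtr<ID3D11DeviceContext> context;
    return D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                             nullptr, 0, D3D11_SDK_VERSION,
                             &device, nullptr, &context);
}
```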

Update: the fix is below, and the Gigasoft website now has PE9DEMO.EXE updated with this fix (a 17 MB simple self-extracting signed exe; we call it the "canned demo"). However, I'm nervous about Iris Xe Max, Iris Xe Pod, desktop Iris Xe Graphics, Iris Plus, and really any graphics, just to get more feedback. So if anyone has these available, I'd really appreciate getting PE9DEMO tested on these graphics units: examples 400+ and the 3D Scientific Chart examples, making sure nothing seems delayed and that the Microsoft Basic Render Driver is not doing the work.


1 Answer


After 2 days of debugging brutal, random behavior, I worked around all of the Intel Iris Xe limitations and have projects running nicely on Iris Xe graphics. I have to say, I bought this refurbished Acer test laptop for $499 and have been coding on it with an ultra-wide monitor, and it's very snappy. Granted, the Iris Xe driver is a bit buggy with games and intensive graphics code, but it's a great slim, lightweight general-use laptop. I believe the Iris Xe technology will thrive if they can get their drivers fixed.

The major issues were the following; each issue was random, with no way to easily debug it.

  1. Enumerating the driver capabilities for the WARP type was somehow corrupting the final CreateDevice call, which resulted in not acquiring an Output interface for the monitor, so the SwapChain was not created. Solution: no longer worry about WARP types and stick to enumerating HARDWARE types only (see the first sketch after this list).

  2. The driver was not providing accurate results from pd3dDevice->CheckMultisampleQualityLevels (it reported that it supported sample counts of 1, 2, 4, and 8 when it did not), so I had to special-case this driver's description string and fall back to sample Count = 1 and Quality = 0, thus no MSAA and jagged images. Any other Count or Quality would crash. After hours of banging my head, I went back to an early DirectX tutorial's code, tested it, it worked, and I started figuring out what was different. I tried everything to get CheckMultisampleQualityLevels to work, but nothing helped; I had to special-case this driver in my logic (see the second sketch after this list).

  3. The compute shader reported Shader Model 5 support, but it did not support double-precision shader storage (which Shader Model 5 hardware generally supports), so I had to change the shader and non-shader logic to use floats instead of doubles. I was hoping to rely on the double-precision shader storage that Shader Model 5 hardware should have by now (see the third sketch after this list).
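For issue 1, a minimal sketch of the workaround, assuming the usual tutorial-style driver-type loop (my actual enumeration code is structured differently): drop D3D_DRIVER_TYPE_WARP from the list that gets probed before the real device is created.

```cpp
// Sketch of the issue-1 workaround: only probe HARDWARE when enumerating
// capabilities, so the later "real" CreateDevice is not affected.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

HRESULT ProbeCapabilities(ID3D11Device** ppDevice, ID3D11DeviceContext** ppContext)
{
    // Previously this array also contained D3D_DRIVER_TYPE_WARP; on Iris Xe that
    // left the subsequent hardware device with zero outputs, so WARP is skipped.
    const D3D_DRIVER_TYPE driverTypes[] = { D3D_DRIVER_TYPE_HARDWARE };

    HRESULT hr = E_FAIL;
    for (D3D_DRIVER_TYPE type : driverTypes)
    {
        hr = D3D11CreateDevice(nullptr, type, nullptr, 0,
                               nullptr, 0, D3D11_SDK_VERSION,
                               ppDevice, nullptr, ppContext);
        if (SUCCEEDED(hr))
            break;
    }
    return hr;
}
```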
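For issue 2, a sketch of the MSAA fallback. IsBuggyIrisXe is a hypothetical helper built from the adapter description string (the exact match string is an assumption); for that adapter the driver's CheckMultisampleQualityLevels answer is ignored and the swap chain stays at Count = 1 / Quality = 0.

```cpp
// Sketch of the issue-2 workaround: pick the MSAA sample count for the swap
// chain, but force 1x on the driver that misreports CheckMultisampleQualityLevels.
#include <d3d11.h>
#include <cwchar>

// Hypothetical helper (match string is an assumption): treat any adapter whose
// description mentions "Iris" and "Xe" as the misreporting driver.
static bool IsBuggyIrisXe(const DXGI_ADAPTER_DESC1& desc)
{
    return wcsstr(desc.Description, L"Iris") != nullptr &&
           wcsstr(desc.Description, L"Xe")   != nullptr;
}

void ChooseMsaa(ID3D11Device* device, const DXGI_ADAPTER_DESC1& desc,
                UINT& sampleCount, UINT& sampleQuality)
{
    sampleCount   = 1;    // safe default: no MSAA
    sampleQuality = 0;

    if (IsBuggyIrisXe(desc))
        return;           // driver claims 2/4/8 work, but they crash; stay at 1x

    // Try 8x, then 4x, then 2x, trusting the driver's report on other hardware.
    for (UINT count = 8; count > 1; count /= 2)
    {
        UINT levels = 0;
        if (SUCCEEDED(device->CheckMultisampleQualityLevels(
                DXGI_FORMAT_R8G8B8A8_UNORM, count, &levels)) && levels > 0)
        {
            sampleCount   = count;
            sampleQuality = levels - 1;
            return;
        }
    }
}
```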
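For issue 3, note that double precision in D3D11 is an optional feature even on Shader Model 5 hardware and can be queried with ID3D11Device::CheckFeatureSupport. A sketch of that query, which is one way to choose the float fallback at runtime instead of (or in addition to) special-casing the driver:

```cpp
// Sketch of detecting double-precision shader support at runtime, so the
// compute shader path can fall back to floats only when doubles are missing.
#include <d3d11.h>

bool SupportsShaderDoubles(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_DOUBLES doubles = {};
    if (FAILED(device->CheckFeatureSupport(D3D11_FEATURE_DOUBLES,
                                           &doubles, sizeof(doubles))))
        return false;
    return doubles.DoublePrecisionFloatShaderOps == TRUE;
}
```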

But all is running now, and hopefully this helps anyone reading. Intel has no excuse not to implement CheckMultisampleQualityLevels properly. The double-precision shader storage issue may be a hardware limitation that was never supported, but we can hope. Who knows about issue 1; it seems odd. Maybe it's my code, but I don't believe so.

Note: the March 2021 driver caused my debugger to hang for 3-4 seconds after each step through code, so I had to revert to the 08/2020 driver to make debugging easier, with no hanging code steps. The 08/2020 driver was more useful to me.

Note: the first-chance C++ exceptions from monza::ddithreadingcontext turned out to be a moot issue. The early DirectX tutorials also produced them, and everything seems stable, so they can be ignored.

Happy coding, Robert.

Comment: Intel has had a CheckMultisampleQualityLevels bug since at least 2017: https://stackoverflow.com/a/43437289/126995 It was reported to Intel too; they did nothing. – Soonts Jun 30 '21 at 07:43