
From time to time my app initializes a bunch of DirectX stuff and loads scenes, some containing large textures (up to 200–300 MB per texture). At first everything works fine, but after a while FromMemory() simply stops working, though only for big textures:

SlimDX.Direct3D11.Direct3D11Exception: E_FAIL: An undetermined error occurred (-2147467259)
  at SlimDX.Result.Throw[T](Object dataKey, Object dataValue)
  at SlimDX.Result.Record[T](Int32 hr, Boolean failed, Object dataKey, Object dataValue)
  at SlimDX.Direct3D11.ShaderResourceView.ConstructFromMemory(Device device, Byte[] memory, D3DX11_IMAGE_LOAD_INFO* loadInformation)
  at SlimDX.Direct3D11.ShaderResourceView.FromMemory(Device device, Byte[] memory)

Of course, I dispose all ShaderResourceViews from the previous scene before loading a new one. But FromMemory() only starts working again after restarting the app. Could you please tell me what else could be wrong?

UPD:

With Texture2D.FromMemory(), I get this:

System.Runtime.InteropServices.SEHException (0x80004005): External component has thrown an exception.
  at D3DX11CreateTextureFromMemory(ID3D11Device* , Void* , UInt32 , D3DX11_IMAGE_LOAD_INFO* , ID3DX11ThreadPump* , ID3D11Resource** , Int32* )
  at SlimDX.Direct3D11.Resource.ConstructFromMemory(Device device, Byte[] memory, D3DX11_IMAGE_LOAD_INFO* info)
  at SlimDX.Direct3D11.Texture2D.FromMemory(Device device, Byte[] memory)

And with native code debugging enabled:

Exception thrown at 0x748AA882 in app.exe: Microsoft C++ exception: std::bad_alloc at memory location 0x00AFC7C8.
Exception thrown: 'System.Runtime.InteropServices.SEHException' in SlimDX.dll

Sadly, I have no idea how D3DX11CreateTextureFromMemory() actually works internally or why it tries to re-allocate memory. Maybe it’s time to move to x64…

Surfin Bird
  • Are you using 32-bit (x86) or 64-bit native? You could be fragmenting video memory or getting an 'out of memory' condition but getting the error in an unusual place (hence `E_FAIL`). Try looking for any output from the debug device. – Chuck Walbourn Mar 20 '17 at 16:22
  • @ChuckWalbourn 32-bit, but it usually doesn’t take more than ≈200 MB RAM (apart from those times when it loads a scene, but there is usually only one huge 200 MB texture per scene — for 360° panorama). Sadly, I can’t find anything informative in debug. `EnableObjectTracking` and `DetectDoubleDispose` are both enabled, and it looks like everything is disposed properly. – Surfin Bird Mar 20 '17 at 17:04
  • Check the maximum texture size supported by your device; usually it won’t load a texture larger than about 192 MB. – Nain Mar 23 '17 at 13:07
  • Try to create your device using DeviceCreationFlags.Debug, and in your project properties (assuming you are using Visual Studio 2015), in the Debug tab, tick the "Enable native code debugging" option. That will give you detailed error reports in the output window (posting those messages will help us troubleshoot your problem). – mrvux Mar 26 '17 at 22:43
  • @catflier, thanks, now it’s a bit more clear what’s going on. – Surfin Bird Mar 27 '17 at 20:04
  • Could you paste your code for texture creation too? Might be able to see if there's an obvious error there also? – mrvux Mar 27 '17 at 21:04

1 Answer


Found the problem. It turns out all I had to do was add the LARGEADDRESSAWARE flag to the executable. Without it, roughly 1 GB was the practical limit — quite easily reached at 300 MB per texture.
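For anyone hitting the same wall: one way to set the flag on a managed executable is to run editbin after the build, from a Visual Studio developer command prompt (a sketch; the executable name app.exe is taken from the exception message above):

```shell
# Run from a Visual Studio developer command prompt, where editbin.exe
# is on the PATH. Marks the 32-bit executable as large-address-aware,
# raising the 2 GB default user address space to 4 GB on a 64-bit OS.
editbin /LARGEADDRESSAWARE app.exe
```

This can also be wired up as a post-build event so the flag survives rebuilds.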

Also, of course, since most of that data ended up on the Large Object Heap, setting `GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce` helped as well.
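For reference, a minimal sketch of the compaction setting (the GCSettings API is standard .NET; where exactly to place the GC.Collect() call is an assumption — here it runs right after the old scene’s resources are disposed):

```csharp
using System;
using System.Runtime;

// After disposing the previous scene's ShaderResourceViews and
// releasing the big byte[] buffers, request that the next blocking
// full collection also compact the Large Object Heap, so the freed
// 200-300 MB buffers don't leave fragmented holes behind.
GCSettings.LargeObjectHeapCompactionMode =
    GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect(); // the LOH is compacted during this collection,
              // and the setting then resets to Default
```

Note that CompactOnce applies to a single collection only, so it needs to be set again before each scene swap.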

Sorry for wasting your time.
