My DirectX 11 C++ engine uses uint16_t (unsigned short) for the vertex index buffer, and everything was working well.
The models I use have since grown to more than 64k indices,
so I changed all references to my index buffer from uint16_t to uint32_t, and now the render is broken.
My variable declarations are:
ID3D11Buffer *IndexBuffer;         // DirectX index buffer
vector<uint32_t> primitiveIndices; // vector of indices, formerly uint16_t
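For context, the buffer is created roughly like this (a minimal sketch, not my exact code; the Device pointer and the usage flags are assumptions):

D3D11_BUFFER_DESC ibd = {};
ibd.Usage     = D3D11_USAGE_DEFAULT;
ibd.BindFlags = D3D11_BIND_INDEX_BUFFER;
ibd.ByteWidth = static_cast<UINT>(primitiveIndices.size() * sizeof(uint32_t)); // was sizeof(uint16_t)

D3D11_SUBRESOURCE_DATA initData = {};
initData.pSysMem = primitiveIndices.data();

HRESULT hr = Device->CreateBuffer(&ibd, &initData, &IndexBuffer); // assumes an existing ID3D11Device *Device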
I finally changed the line
Context->IASetIndexBuffer(IndexBuffer, DXGI_FORMAT_R16_UINT, 0);
to
Context->IASetIndexBuffer(IndexBuffer, DXGI_FORMAT_R8G8B8A8_UINT, 0);
This was done to allow 32-bit indices; however, it still fails to render. I have also updated D3D11_BUFFER_DESC::ByteWidth accordingly (as in the creation sketch above, now using sizeof(uint32_t)).
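For completeness, the draw call that consumes the buffer (sketched; drawing every index in the vector is an assumption):

Context->DrawIndexed(static_cast<UINT>(primitiveIndices.size()), 0, 0); // assumes ID3D11DeviceContext *Context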
Any advice welcome.