I am getting a "vector subscript out of range" error when I try to load OBJ files that don't have normals or texture coordinates. If I load an OBJ file that has both, everything works fine. So I am asking how I can modify my code to load OBJ files without normals and texture coordinates.
This is the struct I use to store vertex data:
struct VERTEX
{
    XMFLOAT3 position;
    XMFLOAT3 normal;
    XMFLOAT2 texcoord;
};
and these vectors are used to store the parsed data:
std::vector<XMFLOAT3> m_position;
std::vector<XMFLOAT2> m_texCoords;
std::vector<XMFLOAT3> m_normals;
std::vector<VERTEX> m_vertices;
std::vector<DWORD> m_Indices;
VERTEX vertex;
This is how I read the face lines. It works for fully specified faces, but I am not sure whether it is the cause of the crash when loading OBJ files without normals and texture coordinates.
else if (strcmp(buffer, "f") == 0)
{
    int fPosition, fTexCoord, fNormal; // indices of a single vertex
    for (int iFace = 0; iFace < 3; iFace++)
    {
        ZeroMemory(&vertex, sizeof(VERTEX));
        file >> fPosition;
        vertex.position = m_position[fPosition - 1];
        if (file.peek() == '/')
        {
            file.ignore();
            if (file.peek() != '/')
            {
                file >> fTexCoord;
                vertex.texcoord = m_texCoords[fTexCoord - 1];
            }
            if (file.peek() == '/')
            {
                file.ignore();
                file >> fNormal;
                vertex.normal = m_normals[fNormal - 1];
            }
            m_vertices.push_back(vertex);
            m_Indices.push_back(m_vertices.size() - 1);
        }
    }
}
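To figure out which face formats the loop actually handles, I reduced the corner-parsing logic to a standalone sketch I can run outside the renderer. `Float3`/`Float2` are plain stand-ins for XMFLOAT3/XMFLOAT2, and `ReadFaceCorner` is just a name I made up. The one behavioral difference from my loop above is that the push_back sits outside the '/' check, so a bare `f 1 2 3` face still produces vertices:

```cpp
#include <istream>
#include <sstream>
#include <vector>

// Plain stand-ins for XMFLOAT3 / XMFLOAT2 so the sketch compiles without DirectX.
struct Float3 { float x, y, z; };
struct Float2 { float x, y; };

struct Vertex
{
    Float3 position{};
    Float3 normal{};
    Float2 texcoord{};
};

// Parses one face corner ("1", "1/2", "1//3", or "1/2/3") from the stream.
// The vertex is pushed unconditionally, so position-only faces still
// produce geometry; missing attributes stay zero-initialized.
void ReadFaceCorner(std::istream& file,
                    const std::vector<Float3>& positions,
                    const std::vector<Float2>& texCoords,
                    const std::vector<Float3>& normals,
                    std::vector<Vertex>& vertices,
                    std::vector<unsigned>& indices)
{
    Vertex vertex{};
    int fPosition = 0, fTexCoord = 0, fNormal = 0;

    file >> fPosition;
    vertex.position = positions[fPosition - 1];

    if (file.peek() == '/')
    {
        file.ignore();
        if (file.peek() != '/')            // "v/vt" or "v/vt/vn"
        {
            file >> fTexCoord;
            vertex.texcoord = texCoords[fTexCoord - 1];
        }
        if (file.peek() == '/')            // "v//vn" or "v/vt/vn"
        {
            file.ignore();
            file >> fNormal;
            vertex.normal = normals[fNormal - 1];
        }
    }

    // Moved outside the '/' check: this now runs for every corner.
    vertices.push_back(vertex);
    indices.push_back(static_cast<unsigned>(vertices.size()) - 1);
}
```

If moving the push_back out of the '/' block is the right fix, I can fold this back into the loader.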
And finally, this is how I set up my vertex and index buffers:
D3D11_INPUT_ELEMENT_DESC vertlayout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 24, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
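In case my hand-computed offsets are part of the problem: the VERTEX struct is two XMFLOAT3s followed by an XMFLOAT2, so the attribute offsets should be 0, 12, and 24 with a 32-byte stride. As a sketch (just the descriptor array, nothing else changed), D3D11_APPEND_ALIGNED_ELEMENT would let Direct3D derive the offsets from the formats instead:

```cpp
D3D11_INPUT_ELEMENT_DESC vertlayout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,
      D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT,
      D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,    0, D3D11_APPEND_ALIGNED_ELEMENT,
      D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
```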
ID3D11Device* pDevice = m_shader->GetDevice();
ID3DBlob* pVSBlob = m_shader->GetVSBlob();
pDevice->CreateInputLayout(vertlayout, ARRAYSIZE(vertlayout), pVSBlob->GetBufferPointer(),
                           pVSBlob->GetBufferSize(), &m_vertexLayout);

D3D11_BUFFER_DESC bd;
ZeroMemory(&bd, sizeof(bd));
bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = sizeof(VERTEX) * m_vertices.size();
bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
bd.CPUAccessFlags = 0;

D3D11_SUBRESOURCE_DATA initData;
ZeroMemory(&initData, sizeof(initData));
initData.pSysMem = &m_vertices[0]; // This is where it breaks
pDevice->CreateBuffer(&bd, &initData, &m_vertexBuffer);

bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = sizeof(DWORD) * m_Indices.size();
bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
bd.CPUAccessFlags = 0;
initData.pSysMem = &m_Indices[0]; // and it breaks here too
pDevice->CreateBuffer(&bd, &initData, &m_indexBuffer);
}
As the comments indicate, my program breaks on the lines initData.pSysMem = &m_vertices[0]; and initData.pSysMem = &m_Indices[0]; with the "vector subscript out of range" error. If I remove the [0] from &m_vertices and &m_Indices, it won't break, but nothing gets rendered.
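Separately, I sketched the pSysMem pattern on its own to confirm what the assertion is about: taking &v[0] on an empty std::vector trips the debug "vector subscript out of range" check, while data() is always safe to call (though an empty buffer obviously still renders nothing, so the real fix must be in the loader). `VERTEX_SKETCH` and `SafeSysMem` are placeholder names:

```cpp
#include <vector>

struct VERTEX_SKETCH { float x, y, z; }; // stand-in for the real VERTEX

// Pointer to hand to initData.pSysMem, or nullptr when the vector is
// empty, in which case CreateBuffer has to be skipped entirely.
const void* SafeSysMem(const std::vector<VERTEX_SKETCH>& v)
{
    // &v[0] on an empty vector trips the debug "vector subscript out of
    // range" assertion; std::vector::data() is safe to call even when empty.
    return v.empty() ? nullptr : v.data();
}
```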
I would just like to know what I can do to make my OBJ loader handle files that don't always include all four data types: vertices, normals, texture coordinates, and faces.