What I am trying to do is create custom model and texture file formats that are simple to load and use. The files are just lists of floats representing XYZ and RGB values, like: 0.5 0.234 0.1 ... ...
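For context, this is roughly how I parse those files (the function name and the growth strategy are just illustrative; there's no error handling for malformed input):

```c
#include <stdio.h>
#include <stdlib.h>

/* Parse whitespace-separated floats from a text buffer into a
 * freshly malloc'd array; the number of values read is returned
 * via *outCount. Sketch only -- no handling of bad input. */
static float *parseFloats(const char *text, int *outCount) {
    int capacity = 16, count = 0;
    float *values = (float*)malloc(sizeof(float) * capacity);
    const char *p = text;
    char *end;
    for (;;) {
        float v = strtof(p, &end);
        if (end == p) break;              /* no more numbers in the buffer */
        if (count == capacity) {          /* grow the array as needed */
            capacity *= 2;
            values = (float*)realloc(values, sizeof(float) * capacity);
        }
        values[count++] = v;
        p = end;
    }
    *outCount = count;
    return values;
}
```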
The problem is that I can't get my array of floats to work as the texture. Here is how I declare the array:
float* textureMap;
Here is how I initialize it:
const int SIZE = 128 * 128 * 3;   // 128x128 texels, 3 floats (RGB) each
textureMap = (float*)malloc(sizeof(float) * SIZE);
for (int i = 0; i < SIZE; i++) {
    textureMap[i] = 0.0f;
}
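When painting, I write individual texels into the flat array like this (row-major, 3 consecutive floats per texel; `TEX_W`/`setTexel` are just my illustrative names):

```c
#include <stdlib.h>

#define TEX_W 128
#define TEX_H 128

/* Set the RGB value of the texel at (x, y) in a flat float array.
 * Layout assumed: row-major, 3 consecutive floats (R, G, B) per texel. */
static void setTexel(float *tex, int x, int y, float r, float g, float b) {
    int i = (y * TEX_W + x) * 3;
    tex[i + 0] = r;
    tex[i + 1] = g;
    tex[i + 2] = b;
}
```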
Using GLUT, I have created a window that lets me paint into the array. Since every RGB value is initialized to 0.0f, I would at least expect the object to appear black, but it stays the default grey colour and never picks up the colours from my texture array.
Here is my call to create the texture:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 130, 130, 0, GL_RGB, GL_FLOAT, textureMap);
I made the width and height 2^n + 2 as per the guidelines on the official OpenGL webpage, though I am not sure that is correct given how I am building my array of floats (the array itself is sized for 128x128, yet I pass 130x130).
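Should the call actually use the plain power-of-two size my array was allocated for? Something like the following is what I would try next (the filter settings are my guesses at sane defaults from the GL documentation, untested; this fragment assumes a current GL context):

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* The default MIN filter expects mipmaps; with none uploaded the texture
 * is "incomplete" and sampling silently fails, which could explain an
 * untextured object even though glGetError() reports nothing. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* 128, not 130: with border = 0 the width/height must be exactly 2^n,
 * matching the 128*128*3 floats actually allocated. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 128, 128, 0,
             GL_RGB, GL_FLOAT, textureMap);

glEnable(GL_TEXTURE_2D);   /* fixed-function pipeline needs this enabled */
```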
I have also tried calling glGetError(), with no success: no errors are reported, and I have verified that error reporting does work by deliberately setting the width and height to -1.
I have made sure that I bind the texture before my call to glBegin(), and I have checked those calls for errors as well, to no avail.
Any suggestions/pointers? Has anyone else tried to define their own texture formats before?
BTW, I am using quads instead of triangles at the moment; that's fine, right?