
What I am trying to do is create custom model and texture files that are simple to load and use. Basically, the files are lists of floats representing XYZ and RGB values, like: 0.5 0.234 0.1 ... ...
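For context, a minimal sketch of how such a whitespace-separated float file could be read (the function name, file handling, and capacity parameter are my assumptions, not from the original code):

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical loader: reads whitespace-separated floats from a text
 * file into a malloc'd array. Returns the number of floats read, or
 * -1 on error. The caller is responsible for freeing *out. */
static int load_floats(const char *path, float **out, int max_count) {
    FILE *f = fopen(path, "r");
    if (!f) return -1;
    float *buf = malloc(sizeof(float) * max_count);
    if (!buf) { fclose(f); return -1; }
    int n = 0;
    while (n < max_count && fscanf(f, "%f", &buf[n]) == 1)
        n++;
    fclose(f);
    *out = buf;
    return n;
}
```

`fscanf` with `%f` skips any whitespace (spaces or newlines) between values, so the same loader works for both the model and texture files described above.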

Problem is I can't get my array of floats to work for the texture. Here is how I define my array:

float* textureMap;

Here is how I initialize it:

const int SIZE = (128*128*3);

textureMap = (float*)malloc(sizeof(textureMap)*SIZE);

for (int i = 0; i < SIZE; i++) { textureMap[i] = 0.0f; }
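As an aside, a sketch of the same allocation and a texel-write helper (the names here are mine, not from the question). Note that `sizeof(textureMap)` in the `malloc` call above is the size of a *pointer*, not of a `float` — on most platforms that over-allocates rather than under-allocates, so it isn't the cause of the problem, but `sizeof(float)` states the intent:

```c
#include <stdlib.h>

#define TEX_W 128
#define TEX_H 128

/* Allocation with the element type spelled out: sizeof(float), not
 * sizeof(textureMap) (which is the size of a float*). calloc also
 * zero-initializes, replacing the explicit loop. */
static float *alloc_texture(void) {
    return calloc(TEX_W * TEX_H * 3, sizeof(float));
}

/* Hypothetical helper: writes one RGB texel into a flat array laid
 * out row-by-row, 3 floats per texel. Texel (x, y) starts at index
 * (y * width + x) * 3. */
static void set_texel(float *tex, int x, int y,
                      float r, float g, float b) {
    int i = (y * TEX_W + x) * 3;
    tex[i + 0] = r;
    tex[i + 1] = g;
    tex[i + 2] = b;
}
```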

Now, using GLUT, I have created a window that lets me paint and fill the array with data. Since all RGB values are initialized to 0.0f, I would at least expect to see my object as black, but it just stays the default grey colour and never takes on the colours in my texture array.

Here is my call to create the texture:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 130, 130, 0, GL_RGB, GL_FLOAT, textureMap);

I have made the width and height 2^n + 2, as per the guidelines on the official OpenGL web page, though I am not sure this is correct given how I am trying to build my array of floats.
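The rule being paraphrased here is width = 2^n + 2*border; with a border of 0 (as in the call above), it reduces on legacy OpenGL to a plain power-of-two requirement, which can be checked with the standard bit trick:

```c
/* Returns 1 if v is a positive power of two (a valid legacy-GL
 * texture dimension when border == 0), else 0. A power of two has
 * exactly one bit set, so v & (v - 1) clears it to zero. */
static int is_pow2(unsigned v) {
    return v != 0 && (v & (v - 1)) == 0;
}
```

Under this check 128 is a valid dimension and 130 is not.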

I have also tried a call to glGetError() with no success (that is, no errors are returned, and I have verified that I can trigger errors by setting the width and height to -1).

I have made sure that I am binding the texture before my call to glBegin() and have even checked these calls for errors to no avail.

Any suggestions/pointers? Has anyone else tried to define their own texture formats before?

BTW am using quads instead of triangles at the moment, that's fine right?

genpfault
Tdiddy
  • Here's a little more info about how I am creating/using the textures: `GLuint textures[1];` `glGenTextures(1, &textures[0]); glBindTexture(GL_TEXTURE_2D, textures[0]);` `glBindTexture(GL_TEXTURE_2D, textures[0]);` – Tdiddy May 31 '11 at 12:36
  • Width and height are 2^n + 2*border. However, you don't have a border, so width and height are just 128. – datenwolf May 31 '11 at 14:16
  • Perhaps I should mention that I am using Mac OS X. I have tried reloading the texture with glTexImage2D right before drawing the quads, and enabling/disabling GL_TEXTURE_2D before and after my call to draw the quads, with no success. Are there any parameters I may not be setting? I am trying to follow online examples, but so far nothing works properly. I have been adapting code from school, but we use Linux machines there and have had trouble going back and forth before. I am looking into removing some of my other calls to try to fix this. – Tdiddy Jun 03 '11 at 13:05
  • Seeing a minimal working (or, in your case, broken) example in full code would speak a thousand words. Otherwise it's just wild guesswork. – datenwolf Jun 03 '11 at 14:36

2 Answers


First, make sure you have enabled texture mapping by calling glEnable(GL_TEXTURE_2D) before rendering (I assume you aren't using shaders at the moment; otherwise, look there for errors).

Second, creating a texture of size 130x130 and filling it with data of size 128x128 is definitely wrong. You seem to have misunderstood those guidelines (they probably refer to the texture border, but your example has no border, so 128x128 is correct).
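The mismatch can be quantified: a GL_RGB/GL_FLOAT upload of the stated dimensions makes glTexImage2D read width * height * 3 floats from the pointer, so a 130x130 call reads 1548 floats past the end of a 128x128 buffer. A sketch of that expected-size arithmetic (the helper name is mine; row alignment is ignored, which is safe for tightly packed float rows):

```c
#include <stddef.h>

/* Number of floats glTexImage2D will consume for a GL_RGB / GL_FLOAT
 * upload of the given dimensions, assuming tightly packed rows. */
static size_t rgb_float_count(size_t width, size_t height) {
    return width * height * 3;
}
```

Passing dimensions larger than the buffer you allocated is undefined behavior on the application side, and OpenGL cannot report it via glGetError, which matches the "no errors thrown" observation in the question.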

Christian Rau
  • Thanks for the response. I have indeed called glEnable in my init function, and I have tried using 128 as well (will switch it back). I also have the following lines in my init before entering GLUT's main loop: `glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);` And no, I am not using shaders. Anything else you can think of? – Tdiddy May 31 '11 at 12:29

SOLVED! The problem was that I needed to call glTexImage2D(...) in the same function that draws the polygons (doh!). What I had been doing was this: whenever my texture was edited in the painting window, I called glTexImage2D(...) in that same function and then told the 3D window to refresh using glutPostWindowRedisplay(...).

Ironically, calling glTexImage2D(...) in the init function also works, but only the first time, for obvious reasons.

Thanks everybody!

Andrei Sfat
Tdiddy
  • That's probably not the real problem, but a symptom. It doesn't matter where you call `glTexImage2D`, as long as the OpenGL context is active and the texture is currently bound (by `glBindTexture`). So I guess that was not the case. Setting the texture image in every frame before drawing is definitely not a good idea regarding performance. – Christian Rau Jun 11 '11 at 15:50