
This is not a texture-related problem like the ones described in other Stack Overflow questions (e.g. Rendering to texture on iOS...).

My redraw loop:

glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glTranslatef(0.0f, 0.0f, -300.0f);
glMultMatrixf(transform);

glVertexPointer(3, GL_FLOAT, MODEL_STRIDE, &model_vertices[0]);
glEnableClientState(GL_VERTEX_ARRAY);

glNormalPointer(GL_FLOAT, MODEL_STRIDE, &model_vertices[3]);
glEnableClientState(GL_NORMAL_ARRAY);

glColorPointer(4, GL_FLOAT, MODEL_STRIDE, &model_vertices[6]);
glEnableClientState(GL_COLOR_ARRAY);
glEnable(GL_COLOR_MATERIAL);
glDrawArrays(GL_TRIANGLES, 0, MODEL_NUM_VERTICES);

The result in the simulator:

Image rendered in the Simulator

Then the result on the iPhone 4 (iOS 5, using OpenGL ES 1.1): Image rendered on the device

Notice the black dots; they appear at random positions as you rotate the object (a brain).

The mesh has 15,002 vertices and about 30,000 triangles.

Any ideas on how to fix this jitter on the device?

– ppaulojr

2 Answers


I've solved the problem by increasing the precision of the depth buffer:

// Set up the projection
static const GLfloat zNear = 0.1f, zFar = 1000.0f, fieldOfView = 45.0f;
glEnable(GL_DEPTH_TEST);
glMatrixMode(GL_PROJECTION);
GLfloat size = zNear * tanf(DEGREES_TO_RADIANS(fieldOfView) / 2.0);
CGRect rect = self.bounds;
glFrustumf(-size, size, -size / (rect.size.width / rect.size.height), size / (rect.size.width / rect.size.height), zNear, zFar);
glViewport(0, 0, rect.size.width, rect.size.height);
glMatrixMode(GL_MODELVIEW);

In the code that produced the jitter, zNear was 0.01f.

The hint came from devforums.apple.com.
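
To see why the zNear value matters so much, here is a minimal C sketch (not part of the original answer) that estimates how much eye-space distance a single depth-buffer step covers at the model's distance of 300 units, assuming the standard perspective depth mapping and a 16-bit depth renderbuffer, which was a common choice on the iPhone 4; the helper name is made up for illustration:

#include <stdio.h>

/* Smallest eye-space separation at distance z that still maps to two
   different values in a 16-bit depth buffer, for the usual perspective
   depth mapping. Hypothetical helper, for illustration only. */
static float depth_step(float zNear, float zFar, float z)
{
    /* d(windowDepth)/dz = zFar * zNear / ((zFar - zNear) * z * z) */
    float slope = (zFar * zNear) / ((zFar - zNear) * z * z);
    return (1.0f / 65536.0f) / slope;
}

int main(void)
{
    float z = 300.0f; /* the model is translated to -300 in the question */
    printf("zNear = 0.01: one depth step spans ~%.0f units\n",
           depth_step(0.01f, 1000.0f, z));
    printf("zNear = 0.1 : one depth step spans ~%.0f units\n",
           depth_step(0.1f, 1000.0f, z));
    return 0;
}

With zNear = 0.01 a single depth step at z = 300 spans on the order of a hundred units, which is consistent with the mesh z-fighting against itself and flickering; raising zNear to 0.1 shrinks the step by roughly a factor of ten.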

– ppaulojr

There's nothing special in the code you posted that would cause this. The problem is more likely in your mesh data than in your code, due to precision limits in how the vertices of your model are processed. This type of problem is common if adjacent triangles have close, but not identical, values for the positions of the vertices they share. It's also exactly the kind of thing that will commonly vary between a GPU and a simulator.

You say that the black dots flash around randomly as you rotate the object. If you're rotating the object, I assume your real code isn't always loading the identity matrix in for the model-view?

If the gaps between your triangles are much smaller than the projected size of one pixel, they will usually be rounded to the same pixel and you won't see any problem. But if one vertex is rounded in one direction and the other vertex is rounded in the other direction, that can leave a one-pixel gap. The locations of the rounding errors depend on the transform matrix, so they move every frame as the object rotates.

If you load a different mesh do you get the same errors?

If you have your brain mesh in a data format that you can edit in a 3D modeling app, then look for an option named something like "weld vertices" or "merge vertices". You set a minimum threshold for vertices to be considered identical, and the tool finds vertex pairs within that distance and moves one (or both) so they match exactly. Many 3D modeling apps also have cleanup tools to ensure that a mesh is manifold, which means (among other things) that there are no holes in the mesh; you usually only want to deal with manifold meshes in 3D rendering. You can also weld vertices in your own code (a rough sketch follows below), though the operation is expensive and not usually something you want to do at runtime unless you really have to.
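
As a rough illustration of that last option (not from the answer itself), here is a brute-force welding pass in C for a non-indexed triangle soup like the one drawn with glDrawArrays in the question; the type and names are hypothetical, and the O(n^2) loop only makes sense as an offline asset-preparation step:

typedef struct { float x, y, z; } Vec3;

/* Snap any two positions closer than `threshold` onto the same point so
   that adjacent triangles end up sharing bit-identical coordinates.
   Hypothetical helper; O(n^2), so run it offline, not per frame. */
static void weld_positions(Vec3 *positions, unsigned count, float threshold)
{
    float t2 = threshold * threshold;
    for (unsigned i = 0; i < count; ++i) {
        for (unsigned j = i + 1; j < count; ++j) {
            float dx = positions[i].x - positions[j].x;
            float dy = positions[i].y - positions[j].y;
            float dz = positions[i].z - positions[j].z;
            if (dx * dx + dy * dy + dz * dz <= t2)
                positions[j] = positions[i]; /* weld j onto i */
        }
    }
}

With an interleaved layout like the one in the question, you would run an equivalent pass over the position component of each vertex before uploading the array.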

– Alan
  • Unfortunately I've already done all those steps in MeshLab. I've been hitting my head against this for several days... – ppaulojr Oct 16 '11 at 16:49