
Suppose that we have a triangle mesh with no information about normals or texture coordinates (basically an OBJ file containing only vertex and face elements).

The objective is to render something decent using OpenGL in a program written in C. Calculating the normal of every triangle is easy...

But what about texture mapping? Can anyone recommend a simple algorithm, documentation, or other resource for mapping the normalized UV coordinates of an image onto a generic triangle mesh?

(For a mesh with a single triangle it is easy, e.g. (0, 0), (1, 0), (0, 1).)

The result doesn't have to be perfect; even professional software can't do this without UV unwrapping and UV seams.
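One common "decent, not perfect" approach is a per-triangle box (dominant-axis) projection: pick the axis the face normal points along most strongly and project each vertex onto the other two axes. This is only a sketch of that idea; the `box_project` name and the `uv_scale` tiling parameter are made up for illustration, and seams will appear wherever adjacent faces pick different axes:

```c
#include <math.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float u, v; } Vec2;

/* Project vertex p onto the plane perpendicular to the dominant
 * axis of the face normal n; uv_scale tiles the texture. */
static Vec2 box_project(Vec3 p, Vec3 n, float uv_scale)
{
    float ax = fabsf(n.x), ay = fabsf(n.y), az = fabsf(n.z);
    Vec2 uv;
    if (ax >= ay && ax >= az)      { uv.u = p.y; uv.v = p.z; } /* face points along X */
    else if (ay >= ax && ay >= az) { uv.u = p.x; uv.v = p.z; } /* face points along Y */
    else                           { uv.u = p.x; uv.v = p.y; } /* face points along Z */
    uv.u *= uv_scale;
    uv.v *= uv_scale;
    return uv;
}
```

With `GL_REPEAT` wrapping, the resulting UVs can fall outside [0, 1] and still tile sensibly.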

genpfault
peppone

1 Answer


The only algorithm I know of works in 2D screen coordinates (screen space):

I already answered a similar question here; focus on the conversion between texture coordinates and 2D vertex positions (i.e., texturePos = (vPos - 0.5) * 2).
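Taking that relation exactly as stated, it maps a coordinate in [0, 1] to [-1, 1], and its inverse maps back. A tiny sketch (function names are illustrative):

```c
/* The relation quoted above, taken as stated:
 * [0, 1] -> [-1, 1], and its inverse. */
static float to_signed_range(float v) { return (v - 0.5f) * 2.0f; }
static float to_unit_range(float p)   { return p * 0.5f + 0.5f; }
```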

EDIT:

Note: the following is only a theory.

There might be a method that works in 3D space. Eventually, the transformations lead to the vertices being rendered in 2D screen coordinates:

local space --> world space --> view space --> NDC space --> screen coordinates

Using the general convention above with the three matrices (Model, View, Projection), and since the vertices end up in 2D space, you could devise some form of algorithm that back-tracks the texture coordinates through the inverse matrices into 3D space and continues from there.

This, by the way, is still not a well-defined, proven algorithm (maybe one exists, and someone will edit this answer to add it in the future...).
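The forward half of that idea can at least be sketched: run a vertex through a combined model-view-projection matrix, perspective-divide to NDC, then remap NDC x/y from [-1, 1] to [0, 1] and use that as a UV (this is essentially projective texture mapping). Everything here is an assumption for illustration: `projected_uv` is a made-up name and `mvp` is an assumed column-major 4x4 matrix stored in a flat array, OpenGL style:

```c
typedef struct { float u, v; } Vec2;

/* Transform (x, y, z, 1) by a column-major 4x4 matrix, divide by w,
 * then remap NDC [-1, 1] to texture space [0, 1]. */
static Vec2 projected_uv(const float mvp[16], float x, float y, float z)
{
    float cx = mvp[0] * x + mvp[4] * y + mvp[8]  * z + mvp[12];
    float cy = mvp[1] * x + mvp[5] * y + mvp[9]  * z + mvp[13];
    float cw = mvp[3] * x + mvp[7] * y + mvp[11] * z + mvp[15];
    Vec2 uv = { (cx / cw) * 0.5f + 0.5f,
                (cy / cw) * 0.5f + 0.5f };
    return uv;
}
```

The drawback is the same as any single projection: faces pointing away from the projector share UVs with faces pointing toward it.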

Dstarred
  • You could have written that in a comment... If I asked the question and didn't add any resources, it's because I couldn't find anything. – peppone Aug 06 '21 at 07:30
  • A general algorithm will be very hard to find, because algorithms are suited to a certain context or situation; they may not be applicable universally, sometimes not even to closely related problems. Anyway, I'll try my best and edit my answer to give you a more detailed one. – Dstarred Aug 06 '21 at 07:35
  • One of the key reasons `textureCoordinates` have no general algorithm is that they do not relate **closely** to the `vertices` or `normals` (there is not enough information to derive them), whereas `normals` exist on a per-vertex or per-triangle basis: if a triangle made of three vertices is treated as a plane, basic vector math gives you its normal. – Dstarred Aug 06 '21 at 07:46