
How do you calculate UV coordinates for points on a plane?

I have a polygon - 3 or 4 or more points - that is on a plane - that is to say, all the points are on a plane. But it can be at any angle in space.

One side of this polygon - two points - are to be mapped to two corresponding 2D points in a texture - I know these two points in advance. I also know the x and y scale for the texture, and that no points fall outside the texture extent or other 'edge cases'.

Here's an image where the top-most textured quad is distorted:

[Image: the textured scene, with the distorted quad outlined in yellow]

I outlined a bad quad in yellow. Imagine that I know the UV coordinates of those two bottom-most corners on that quad, and want to calculate the proper UV coordinates of the other two points...

How do you calculate the UV coordinates of all the other points in the plane relative to these two points?

Imagine my texture is a piece of paper in real life, and I want to texture your (flat) car door. I place two dots on my paper, which I line up with two dots on your car door. How do I calculate where the other locations on the car door are under the paper?

Can you use trilateration? What would the pseudo-code look like for two known points in 2D space?


Success using brainjam's code:

def set_texture(self, texture, a_ofs, a, b):
    # a_ofs: index into self.m of vertex A; a, b: the known UV coords of A and B
    self.texture = texture
    self.colour = (1, 1, 1)
    self.texture_coords = tx = []
    A, B = self.m[a_ofs:a_ofs+2]
    for P in self.m:
        if P == A:
            tx.append(a)
        elif P == B:
            tx.append(b)
        else:
            # sigma = |P-A| / |B-A|
            scale = P.distance(A) / B.distance(A)
            # theta = angle between P-A and B-A, from the normalized dot product
            cos_theta = (P-A).dot(B-A) / (P.distance(A) * B.distance(A))
            theta = math.acos(cos_theta)
            # rotate b-a by theta, then scale and translate back to a
            x, y = b[0]-a[0], b[1]-a[1]
            x, y = x*math.cos(theta) - y*math.sin(theta), \
                   x*math.sin(theta) + y*math.cos(theta)
            tx.append((a[0] + x*scale, a[1] + y*scale))

[Image: the quads now tiled correctly]

Will

3 Answers


You have to express the other points in terms of two chosen vectors and an origin.

I would do something like this:

Choose 3 3D points with corresponding UV coordinates:

  • A(x,y,z,u,v)
  • B(x,y,z,u,v)
  • C(x,y,z,u,v)

Then, using the x,y,z coords, we want to express a given 3D point D as:

D = A + alpha ( B - A ) + beta ( C - A ) + gamma ( B - A ) X ( C - A )

This gives 3 equations (one each for x, y, z), where X is the cross product and alpha, beta, gamma are the unknowns. We want this to create a linear relation between uv and xyz.

Compute W = (B - A) X (C - A); we need to solve:

Dx - Ax = alpha.(Bx-Ax) + beta.(Cx-Ax) + gamma.Wx

Dy - Ay = alpha.(By-Ay) + beta.(Cy-Ay) + gamma.Wy

Dz - Az = alpha.(Bz-Az) + beta.(Cz-Az) + gamma.Wz

Compute the inverse of the matrix M:

       | (Bx-Ax) , Cx-Ax , Wx | 
   M = | (By-Ay) , Cy-Ay , Wy | 
       | (Bz-Az) , Cz-Az , Wz | 

Call the resulting matrix N; note that it does not depend on D.

Then compute alpha, beta, gamma for D by:

(alpha,beta,gamma) = N.(D-A)

Then compute u, v for D by:

Du = Au + alpha( Bu - Au ) + beta( Cu - Au )

Dv = Av + alpha( Bv - Av ) + beta( Cv - Av )

gamma is not used, as it measures the (scaled) distance between D and the plane through A, B, C; it is zero when D lies on the plane.
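A minimal numpy sketch of the steps above (the function name is mine, and it uses `numpy.linalg.solve` on M rather than computing the inverse N explicitly; the result is the same):

```python
import numpy as np

def uv_from_three(A, B, C, uv_a, uv_b, uv_c, D):
    """Express D - A in the basis (B-A, C-A, W), then reuse alpha, beta for the UVs."""
    A, B, C, D = (np.asarray(p, float) for p in (A, B, C, D))
    uv_a, uv_b, uv_c = (np.asarray(p, float) for p in (uv_a, uv_b, uv_c))
    W = np.cross(B - A, C - A)                      # the answer's W, normal to the plane
    M = np.column_stack((B - A, C - A, W))          # the matrix M above
    alpha, beta, gamma = np.linalg.solve(M, D - A)  # gamma ~ 0 when D is on the plane
    return uv_a + alpha * (uv_b - uv_a) + beta * (uv_c - uv_a)
```

For example, with A, B, C at the corners of a unit triangle in the z=0 plane and matching UVs, the midpoint-ish point (0.5, 0.5, 0) comes back as UV (0.5, 0.5).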

fa.
  • very promising, thx! What are alpha, beta and gamma? How might it look in C or Python pseudo-code, so I can get a grip on what you're saying? – Will Mar 17 '11 at 22:49
  • alpha, beta, gamma are real coefficients which express the decomposition of D-A on the base formed by the 3 vectors B-A,C-A, and (B-A)X(C-A). It's a basis change, but not orthonormal. – fa. Mar 17 '11 at 23:43

Label the vertices of your 3D polygon in counter-clockwise order, starting with the two vertices whose UV coordinates are known. Call these labels A, B, C, D. The labels of the corresponding vertices in UV space are a, b, c, d, where a and b are known.

The problem you've stated is, for a point P in the original polygon, to determine the corresponding UV coordinate p. (I believe that you only care about calculating the UV coordinates c and d for the points C and D, but the general solution for P is the same.)

First, calculate the angle θ between P-A and B-A. This is easily done using the dot product of the normalized vectors, and acos.

α = (P-A)⋅(B-A)/(|P-A||B-A|)

θ = acos(α)

We also calculate the ratio of the lengths:

σ = |P-A|/|B-A|

Now to calculate p in UV space, we simply rotate the vector b-a by angle θ (keeping a fixed) and scale by σ.

Let R, the matrix for a rotation by angle θ, be

| +cos(θ) -sin(θ) |
| +sin(θ) +cos(θ) |

Then p = a + σR(b − a).

And you're done.
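The rotate-and-scale recipe above can be sketched in plain Python (function name mine; the clamp before `acos` guards against floating-point rounding, and, as the answer assumes, the vertices are taken in counter-clockwise order so the sign of θ works out):

```python
import math

def uv_rotate_scale(A, B, a, b, P):
    """A, B, P are 3D points; a, b are the known UVs of A and B. Returns p for P."""
    PA = [p - q for p, q in zip(P, A)]
    BA = [p - q for p, q in zip(B, A)]
    len_PA = math.sqrt(sum(c * c for c in PA))
    len_BA = math.sqrt(sum(c * c for c in BA))
    # theta = angle between P-A and B-A, via the normalized dot product and acos
    cos_theta = sum(p * q for p, q in zip(PA, BA)) / (len_PA * len_BA)
    theta = math.acos(max(-1.0, min(1.0, cos_theta)))  # clamp against rounding
    sigma = len_PA / len_BA                            # ratio of lengths
    # rotate b-a by theta, scale by sigma, translate back to a
    dx, dy = b[0] - a[0], b[1] - a[1]
    rx = dx * math.cos(theta) - dy * math.sin(theta)
    ry = dx * math.sin(theta) + dy * math.cos(theta)
    return (a[0] + sigma * rx, a[1] + sigma * ry)
```

For instance, with A=(0,0,0), B=(1,0,0) mapped to a=(0,0), b=(1,0), the point P=(0,1,0) lands at p=(0,1): θ is 90°, σ is 1, and b−a rotates from (1,0) to (0,1).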

brainjam
  • @brainjam this is really promising but I can't get my attempt at translating it to my python to work properly; I've updated the question with details, and hope its an obvious bug in my code? – Will Mar 20 '11 at 23:06
  • @Will, try replacing `x, y = a[0]-b[0], a[1]-b[1]` with `x, y = b[0]-a[0], b[1]-a[1]`? – brainjam Mar 21 '11 at 02:49
  • thanks yes; I think I had that originally when I transcribed, I guess I forgot all the random changes I tried due to fatigue when I posted my code in frustration last night! It is no longer rotating the wrong way, but rather smearing as though its not varying on the Y axis or something. I updated the picture in the question appendix. – Will Mar 21 '11 at 06:37
  • @Will, yes, it looks like the Y coordinates are all the same. Don't you love graphics debugging? At this point the best thing is perhaps to print the results of each line in the loop and see what's being computed. And then also print the values of `self.m` and `tx`. If I could see all that I could check it for correctness. You can either post it here, or feel free to look up my email address and send it there. – brainjam Mar 21 '11 at 13:27
  • P(-0.39432,0.37187,0.23580) (1.283621949056725e-17, 1.2096313728906054) P(-0.39432,0.29261,0.39432) (0.0, 1.0) (A) P(0.45113,0.29261,0.39432) (1.0, 1.0) (B) P(0.45113,0.37187,0.23580) (0.99999999999999989, 1.2096313728906052) – Will Mar 21 '11 at 13:39
  • Code is here: https://github.com/williame/GlestNG/blob/master/prototyping/houses.py (without the b-a fix applied) – Will Mar 21 '11 at 13:42
  • @Will, in your second picture, A and B appear to map to (0,1) and (1,1). But in this list they map to (1,1) and (1,1.2). At any rate, it looks like A,B,C,D map to (1,1),(1,1.2),(0,1.2),(0,1). So all the UV's are distinct. But they are 'outside' of the typical unit square, which contradicts your statement "I also know the x and y scale for the texture, and that no points fall outside the texture extent" You say you are using GL_REPEAT, so that should be OK. But is it possible your software downstream is taking the coordinates "modulo 1", leaving you with (0,0),(.99,0.2),(0,0.2),(0,0)? – brainjam Mar 21 '11 at 13:58
  • @Will, could you try changing `uv_a, uv_b = (0.,1.), (1.,1.)` to `uv_a, uv_b = (0.,0.), (1.,0.)` This will map the edge of the roof to the bottom edge of the texture instead of the top edge, and will keep the roof UVs inside the texture square. Let me know what happens. – brainjam Mar 21 '11 at 14:42
  • Complete success! It was that the GL_REPEAT was, for some reason, not sticking somehow. I'll have to debug that. But the tiling now works. Thx for following-up as I got stuck, well worth 100pts! – Will Mar 21 '11 at 18:35

U and V are numbers between 0 and 1.

So, say in your situation the larger edge has length 10 and the smaller edge 5; each "gap" is then 2.5. This is normalized to give the value you need.

So some example pseudo-code:

bottomLeftVector(0, 0, 0)
bottomLeftTexture(0, 0)
topLeftVector(2.5, 5, 0)
topLeftTexture(0.25, 1)
topRightVector(7.5, 5, 0)
topRightTexture(0.75, 1)
bottomRightVector(10, 0, 0)
bottomRightTexture(1, 0)
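The normalization step this answer describes, for the trapezoid in the example, amounts to:

```python
def edge_u(distance_along_bottom, bottom_length):
    # normalize a distance measured along the bottom edge into the 0..1 U range
    return distance_along_bottom / bottom_length

# the trapezoid above: bottom edge 10, top edge 5, a 2.5 "gap" per side
assert edge_u(2.5, 10) == 0.25        # U of the top-left vertex
assert edge_u(10 - 2.5, 10) == 0.75   # U of the top-right vertex
```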

Hope this helps!

D Hansen
  • you've done it in 2D. Every texturing demo starts with plastering textures on a cube with nice square sides. The problem is computing the 2D coordinates on the plane that a bunch of 3D points lie on. – Will Mar 17 '11 at 21:40
  • well it's the same generic problem if working with 2D textures. The distance between the two points that are causing the problem needs to be found from the 3D shape and then placed into a 2D texture. This is how I've solved the same problem myself. – D Hansen Mar 17 '11 at 21:43
  • so how did you go from 3D points to corresponding 2D points on the plane? – Will Mar 17 '11 at 21:47
  • well if your object is circular, then the 3D points don't need to be taken into account. This is because you are mapping a 2D texture to it. So just see the face you are rendering the texture to as a 2D shape. This makes the job a lot easier to figure out. – D Hansen Mar 17 '11 at 21:49
  • or another way of saying it is that if texture mapping a 2D texture to a 3D object, you map each face, not each vertex. – D Hansen Mar 17 '11 at 22:04