
I want to be able to click the touch screen and use the point touched as the starting coordinate for a ray to be used for picking.

How do I convert the point returned from touching the screen into something I can use in the GL world coordinates?

A search brings up lots of confusing possibilities, including the use of gluUnProject with lots of reports about whether it is supported and how to port it.

Can someone lay it out straight for me please?

I'm using Objective-C and Xcode, and I'm compiling for iPhone.

Dave

1 Answer


Step 0: Get gluUnProject:

The reports of needing it are true. That function does all the heavy lifting for you. I know at one point the MESA project had an implementation that worked almost perfectly on iOS without modifications. I'm not sure if that's still available. Barring that, you'll just have to do some research on it and either roll your own or port someone else's. It's a bit heavy on the linear algebra, so good luck.
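
If you end up rolling your own, the underlying math is: convert the window-space point to normalized device coordinates, then push it through the inverse of (projection × modelview) and undo the perspective divide. Below is a minimal sketch of that idea, not the canonical gluUnProject source; it assumes iOS 5+ so it can lean on GLKit's math helpers (GLKMatrix4Invert, GLKMatrix4MultiplyVector4), and the name myUnProject is made up for this example.

    #import <GLKit/GLKMath.h>

    // Sketch of gluUnProject's math: window-space point -> world-space point.
    // Returns false if the combined matrix isn't invertible or w comes out zero.
    static bool myUnProject(float winX, float winY, float winZ,
                            GLKMatrix4 modelview, GLKMatrix4 projection,
                            const int viewport[4], GLKVector3 *result)
    {
        bool invertible = false;
        GLKMatrix4 inverse = GLKMatrix4Invert(GLKMatrix4Multiply(projection, modelview),
                                              &invertible);
        if (!invertible) return false;

        // Window coordinates -> normalized device coordinates in [-1, 1].
        GLKVector4 ndc = GLKVector4Make(2.0f * (winX - viewport[0]) / viewport[2] - 1.0f,
                                        2.0f * (winY - viewport[1]) / viewport[3] - 1.0f,
                                        2.0f * winZ - 1.0f,
                                        1.0f);

        // Back through the inverse transform, then undo the perspective divide.
        GLKVector4 obj = GLKMatrix4MultiplyVector4(inverse, ndc);
        if (obj.w == 0.0f) return false;
        *result = GLKVector3Make(obj.x / obj.w, obj.y / obj.w, obj.z / obj.w);
        return true;
    }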

Step 1: Convert from UIKit coordinates to OpenGL coordinates:

This normally involves two things (a combined sketch follows the list):

  1. Flip the Y-coordinate, because UIKit likes its origins in the top left, whereas OpenGL likes its origins in the bottom left.

    touchLocation.y = [[self view] bounds].size.height - touchLocation.y;
    
  2. Convert from "Screen Units" to pixels. This keeps things consistent across standard and retina display devices.

    CGFloat scale = [[UIScreen mainScreen] scale];
    touchLocation.x *= scale;
    touchLocation.y *= scale;
    

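Putting the two conversions together, here is a hedged sketch of a touch handler (it assumes the GL view is [self view] and that you grab the touch in the owning view controller; adapt it to wherever you actually handle touches):

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        CGPoint touchLocation = [touch locationInView:[self view]];

        // 1. Flip Y: UIKit's origin is top-left, OpenGL's is bottom-left.
        touchLocation.y = [[self view] bounds].size.height - touchLocation.y;

        // 2. Screen units -> pixels, so standard and Retina devices behave the same.
        CGFloat scale = [[UIScreen mainScreen] scale];
        touchLocation.x *= scale;
        touchLocation.y *= scale;

        // touchLocation is now in OpenGL window coordinates, ready to unproject.
    }
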
Step 2: Use gluUnProject on your converted coordinate:

gluUnProject() technically converts a 3D point in window space to a 3D point in world space. So, to get a ray, you'll need to call it twice: once for the near clipping plane and once for the far clipping plane. That will give you two points, from which you can get a ray. To call gluUnProject(), you'll need access to your converted 2D touch coordinate, the current OpenGL viewport, the current OpenGL modelview matrix, and the current OpenGL projection matrix. Pseudocode:

    Vector3 near, far;  // Vector3, MakeRay, Vector3Subtract are placeholder types/helpers
    // winZ = 0 unprojects onto the near clipping plane, winZ = 1 onto the far plane.
    gluUnProject(touchLocation.x, touchLocation.y, 0, _modelview, _projection, _viewport, &near.x, &near.y, &near.z);
    gluUnProject(touchLocation.x, touchLocation.y, 1, _modelview, _projection, _viewport, &far.x, &far.y, &far.z);
    return MakeRay(near, Vector3Subtract(far, near));  // origin = near, direction = far - near
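
For a concrete version of that pseudocode, here's a sketch using GLKit vectors and the myUnProject helper sketched above (substitute a ported gluUnProject if that's what you ended up with). _modelview, _projection, and _viewport are assumed to be whatever matrices and viewport you used to render the current frame, and normalizing the direction is just a convenience for later intersection tests:

    // Build a pick ray from the converted touch location.
    GLKVector3 nearPoint, farPoint;
    myUnProject(touchLocation.x, touchLocation.y, 0.0f,
                _modelview, _projection, _viewport, &nearPoint);  // on the near plane
    myUnProject(touchLocation.x, touchLocation.y, 1.0f,
                _modelview, _projection, _viewport, &farPoint);   // on the far plane

    GLKVector3 rayOrigin    = nearPoint;
    GLKVector3 rayDirection = GLKVector3Normalize(GLKVector3Subtract(farPoint, nearPoint));
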
Matt Wilding
  • I've implemented gluUnproject now. In your pseudocode is viewportcoord the touched screen coordinate after step 1? Why is the far point of the ray calculated from far-near? – Dave Jan 24 '12 at 19:24
  • @Dave Yes, viewportcoord is the tap location in the viewport. I edited the code for clarity. Mathematically, a ray is defined as a vector origin, and a vector direction from the origin. So the "far point of the ray" isn't an absolute point, it's the vector from the near point, pointing to the far point, which is computed as (far-near). – Matt Wilding Jan 24 '12 at 20:05
  • My solution isn't perfect, but I think that has more to do with my ray collision equation than with the ray coordinates. Thank you for explaining the MakeRay part, by the way; it helped a lot. – Dave Jan 26 '12 at 20:50
  • @Dave, no problem. You're probably right; picking the scene with the ray is far easier to mess up than actually getting the ray in the first place. – Matt Wilding Jan 26 '12 at 21:22