
I'm trying to implement ray picking in OpenGL ES 2.0 to determine whether an object has been clicked. For now I'm just trying to check whether a specific triangle has been pressed. I'm using this site as a reference: http://android-raypick.blogspot.ca/2012/04/first-i-want-to-state-this-is-my-first.html

This is what I have so far:

public void onClick(float x, float y)
{
    float[] temp = new float[4];
    float[] temp2 = new float[4];
    System.out.println("X coordinate: " + x);
    System.out.println("Y coordinate: " + y);

    // Touch coordinates have their origin at the top left, OpenGL's window
    // coordinates at the bottom left, so flip y.
    y = (float) viewport[3] - y;

    // Unproject the touch point at winZ = 1.0f.
    int res = GLU.gluUnProject(x, y, 1.0f,
            mMVPMatrix, 0,
            mProjectionMatrix, 0,
            viewport, 0,
            temp, 0);

    // Transform the unprojected point by mMVPMatrix.
    Matrix.multiplyMV(temp2, 0, mMVPMatrix, 0, temp, 0);
    float[] nearCoOrds = new float[3];

    if (res == GLES20.GL_TRUE)
    {
        // Divide by w to get back to 3D coordinates.
        nearCoOrds[0] = temp2[0] / temp2[3];
        nearCoOrds[1] = temp2[1] / temp2[3];
        nearCoOrds[2] = temp2[2] / temp2[3];
        System.out.println("Near0: " + nearCoOrds[0]);
        System.out.println("Near1: " + nearCoOrds[1]);
        System.out.println("Near2: " + nearCoOrds[2]);
    }

    // Unproject the same touch point at winZ = 0.
    res = GLU.gluUnProject(x, y, 0,
            mMVPMatrix, 0,
            mProjectionMatrix, 0,
            viewport, 0,
            temp, 0);

    Matrix.multiplyMV(temp2, 0, mMVPMatrix, 0, temp, 0);
    float[] farCoOrds = new float[3];

    if (res == GLES20.GL_TRUE)
    {
        farCoOrds[0] = temp2[0] / temp2[3];
        farCoOrds[1] = temp2[1] / temp2[3];
        farCoOrds[2] = temp2[2] / temp2[3];
        System.out.println("Far0: " + farCoOrds[0]);
        System.out.println("Far1: " + farCoOrds[1]);
        System.out.println("Far2: " + farCoOrds[2]);
    }

    // Direction vector from one unprojected point to the other.
    float[] coords = new float[3];
    coords[0] = farCoOrds[0] - nearCoOrds[0];
    coords[1] = farCoOrds[1] - nearCoOrds[1];
    coords[2] = farCoOrds[2] - nearCoOrds[2];

    System.out.println("REAL COORDS 0: " + coords[0]);
    System.out.println("REAL COORDS 1: " + coords[1]);
    System.out.println("REAL COORDS 2: " + coords[2]);
}

The x and y floats are the coordinates of where the finger pressed the screen. onClick is called from the MainActivity.
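
For context, the forwarding from the Activity looks roughly like this (mGLSurfaceView and mRenderer are placeholders here, not my exact field names, so treat it as a sketch):

@Override
public boolean onTouchEvent(MotionEvent event)
{
    if (event.getAction() == MotionEvent.ACTION_DOWN)
    {
        final float x = event.getX();
        final float y = event.getY();
        // Run the picking code on the GL thread so the matrices are not
        // being updated by onDrawFrame at the same time.
        mGLSurfaceView.queueEvent(new Runnable() {
            @Override
            public void run()
            {
                mRenderer.onClick(x, y);
            }
        });
        return true;
    }
    return super.onTouchEvent(event);
}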

In

 GLU.gluUnProject(x, y, 1.0f, 
            mMVPMatrix, 0,
            mProjectionMatrix, 0,
            viewport, 0, 
            temp, 0);

mMVPMatrix is the modelview matrix, mProjectionMatrix is the projection matrix, and viewport holds the values {0, 0, screenWidth, screenHeight}.
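
For reference, the viewport array and the projection matrix are filled in from the renderer; a minimal sketch of that setup (the frustum values here are only an example, not my real ones):

@Override
public void onSurfaceChanged(GL10 unused, int width, int height)
{
    GLES20.glViewport(0, 0, width, height);
    // Saved for gluUnProject: {x, y, width, height}.
    viewport = new int[] { 0, 0, width, height };

    float ratio = (float) width / height;
    Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 1, 10);
}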

An example of the output I get (touching around the middle of the screen):

REAL COORDS 0: -0.21542415
REAL COORDS 1: 0.31117013
REAL COORDS 2: 9.000003

My question is whether I'm on the right track here. Have I got the right idea, or have I misunderstood something? Are there other ways I could achieve touch detection on triangles?

Thanks for any help or guidance!

Araw

2 Answers


There is an excellent OpenGL framework for Android called Rajawali. It supports object picking, and the sample code looks very simple; you should try it.

molnarm
    Thanks for your reply, but I was hoping to be able to implement things myself, without the usage of another framework than what is offered from the Android API :) – Araw Jan 13 '13 at 12:10

I believe you may have misunderstood a bit (or maybe I have^^). The near and far coordinates are used to construct the ray segment you test against your polys/hitboxes. That segment is in what's called world space, and to test against your vertices/models you will need to convert them from model space (what they are in when loaded) to world space by multiplying them by their modelview matrix (which holds all of their transforms, rotations, etc.). The article you linked seems to take it through that point^^ Make sense, or have I missed something?
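
To make that concrete, here is a rough sketch of pushing one model-space vertex into world space before testing. The method and variable names are just mine, not from the article; Matrix is android.opengl.Matrix.

// Sketch: move one model-space vertex into world space so it can be tested
// against the near/far ray segment. modelMatrix is whatever matrix you used
// to place the triangle in the scene (matrixGrabber.mModelView in the
// example you linked).
public static float[] toWorldSpace(float[] modelMatrix, float vx, float vy, float vz)
{
    float[] in  = { vx, vy, vz, 1.0f };   // w = 1 because this is a position
    float[] out = new float[4];
    Matrix.multiplyMV(out, 0, modelMatrix, 0, in, 0);
    return new float[] { out[0], out[1], out[2] };
}

Do the same for each of the triangle's three vertices, then run your intersection test against the result.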

bonus link for you: http://www.siggraph.org/education/materials/HyperGraph/raytrace/rtinter0.htm
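
And if you want to roll the intersection test yourself, here is a rough, untested Java sketch of a Möller-Trumbore style ray/triangle test, to be run once the ray and the triangle are both in world space (the helper names are mine, not from that page):

// Rough Möller-Trumbore ray/triangle test. orig + dir describe the ray
// (e.g. orig = near point, dir = far - near), v0/v1/v2 are the triangle's
// world-space vertices. Returns true if the ray hits the triangle.
public static boolean rayHitsTriangle(float[] orig, float[] dir,
                                      float[] v0, float[] v1, float[] v2)
{
    final float EPS = 1e-6f;
    float[] e1 = sub(v1, v0);
    float[] e2 = sub(v2, v0);
    float[] p  = cross(dir, e2);
    float det = dot(e1, p);
    if (Math.abs(det) < EPS) return false;   // ray is parallel to the triangle
    float invDet = 1.0f / det;
    float[] t = sub(orig, v0);
    float u = dot(t, p) * invDet;
    if (u < 0.0f || u > 1.0f) return false;
    float[] q = cross(t, e1);
    float v = dot(dir, q) * invDet;
    if (v < 0.0f || u + v > 1.0f) return false;
    float dist = dot(e2, q) * invDet;
    // If dir = far - near and you only want hits inside the segment,
    // also check dist <= 1.0f here.
    return dist >= 0.0f;                      // hit in front of the ray origin
}

private static float[] sub(float[] a, float[] b) {
    return new float[] { a[0] - b[0], a[1] - b[1], a[2] - b[2] };
}
private static float[] cross(float[] a, float[] b) {
    return new float[] { a[1] * b[2] - a[2] * b[1],
                         a[2] * b[0] - a[0] * b[2],
                         a[0] * b[1] - a[1] * b[0] };
}
private static float dot(float[] a, float[] b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

With the values from your question, orig would be the near coordinates and dir the far-minus-near vector you are printing as REAL COORDS.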

Matthew Clark
  • Yeah, but the x and y coordinates are from a touch event. What should the model matrix for that be? That's what confuses me a bit... – Araw Jan 15 '13 at 19:30
  • You don't need a model matrix for it. That is already in world space. You need to convert the mesh data you are testing to world space using its modelview matrix.^^ (matrixGrabber.mModelView from the example you linked) – Matthew Clark Jan 15 '13 at 21:25
  • Thanks for your reply, I haven't been online for a while... So you mean that I only need to convert the coordinates of the triangle to world space and then check whether the finger touch intersects with that? – Araw Jan 18 '13 at 08:55
  • Basically^^ The near and far coords you are getting are the start and end of the "touch ray" in world space, so you should be able to move your model/triangle to world space, test whether the ray segment intersects with it, and bob's your uncle!^^ – Matthew Clark Jan 18 '13 at 21:09