
I have created an augmented reality app based on the Vuforia platform. I am modifying it so that, if the target is lost, the system uses the last known position of the target together with device orientation data from CoreMotion to keep the object in the correct position.

I need help with the last bit: integrating the CoreMotion data. I think the best way to do this would be to rotate the virtual camera based on the gyro input, but I'm not an OpenGL ES expert. Can someone shed some light on the best way to do this? I know how to get the device orientation data; it's the OpenGL and matrix algebra that I need guidance on.
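
For reference, this is roughly how I'm pulling the orientation data at the moment (just a sketch with simplified names; motionManager and currentRotationMatrix are properties in my own code):

#import <CoreMotion/CoreMotion.h>

// Start device-motion updates and keep the latest attitude as a rotation matrix
// so it can be read from the render loop.
- (void)startMotionUpdates {
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

    [self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
                                                            toQueue:[NSOperationQueue mainQueue]
                                                        withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (motion != nil) {
            // CMRotationMatrix describing the device's current orientation.
            self.currentRotationMatrix = motion.attitude.rotationMatrix;
        }
    }];
}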

My renderFrame method is below.

-(void)renderFrameQCAR {
    [self setFramebuffer];

    // Clear colour and depth buffers
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Render video background and retrieve tracking state
    QCAR::State state = QCAR::Renderer::getInstance().begin();
    QCAR::Renderer::getInstance().drawVideoBackground();

    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);


    // Check if any trackables are visible.
    int numberOfTrackables = state.getNumActiveTrackables();
    QCAR::Matrix44F modelViewMatrix;


    // Skip rendering if there is nothing to render.
    if (numberOfTrackables > 0 || hasPickedUpTrackablePreviously == YES) {

        // If there are none, but one was picked up in the past, use the last pose matrix.
        if (numberOfTrackables == 0 && hasPickedUpTrackablePreviously == YES) {
            modelViewMatrix = trackablePoseMatrix;
        }
        else {
            // Get the trackable
            const QCAR::Trackable* trackable = state.getActiveTrackable(0);
            modelViewMatrix = QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

            // Store these variables for use later.
            trackablePoseMatrix = modelViewMatrix;
            hasPickedUpTrackablePreviously = YES;
        }


        // Fetch the 3D object to render.
        Object3D *obj3D;

        if (currentlyChangingTextures == YES || useDummyModel == YES) {
            obj3D = dummyObject;
        }
        else {
            obj3D = [objects3D objectAtIndex:0];
        }


        // Render using the appropriate version of OpenGL
        // OpenGL 2
        QCAR::Matrix44F modelViewProjection;

        // Apply usual transformations here
        ShaderUtils::translatePoseMatrix(sideToSideFloat, forwardBackFloat, 0.0f, &modelViewMatrix.data[0]);
        ShaderUtils::scalePoseMatrix(kObjectScale * sizeFloat, kObjectScale * sizeFloat, kObjectScale * sizeFloat, &modelViewMatrix.data[0]);
        ShaderUtils::rotatePoseMatrix(0.0f + rotationAngleFloat, 0.0f, 0.0f, 1.0f, &modelViewMatrix.data[0]);


        // Apply our translation vector here based on gesture info from the buttonOverlayViewController
        QCAR::Vec3F translationFromWorldPerspective = SampleMath::Vec3FTransformNormal(translationVectorFromCamerasPerspective, inverseModelViewMatrix);

        translationFromWorldPerspective = SampleMath::Vec3FNormalize(translationFromWorldPerspective);

        theTranslation.data[0] = theTranslation.data[0] + speed*translationFromWorldPerspective.data[0];
        theTranslation.data[1] = theTranslation.data[1] + speed*translationFromWorldPerspective.data[1];
        theTranslation.data[2] = 0.0f;

        ShaderUtils::translatePoseMatrix(theTranslation.data[0], theTranslation.data[1], theTranslation.data[2], &modelViewMatrix.data[0]);

        // Update inverseModelViewMatrix
        inverseModelViewMatrix = SampleMath::Matrix44FInverse(modelViewMatrix);

        // Multiply modelview and projection matrix as usual
        ShaderUtils::multiplyMatrix(&qUtils.projectionMatrix.data[0], &modelViewMatrix.data[0], &modelViewProjection.data[0]);

        glUseProgram(shaderProgramID);

        glVertexAttribPointer(vertexHandle, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid*)obj3D.vertices);
        glVertexAttribPointer(normalHandle, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid*)obj3D.normals);
        glVertexAttribPointer(textureCoordHandle, 2, GL_FLOAT, GL_FALSE, 0, (const GLvoid*)obj3D.texCoords);

        glEnableVertexAttribArray(vertexHandle);
        glEnableVertexAttribArray(normalHandle);
        glEnableVertexAttribArray(textureCoordHandle);

        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, [obj3D.texture textureID]);
        glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE, (const GLfloat*)&modelViewProjection.data[0]);
        glDrawArrays(GL_TRIANGLES, 0, obj3D.numVertices);

        ShaderUtils::checkGlError("EAGLView renderFrameQCAR");
    }


    // Disable these things.
    glDisable(GL_DEPTH_TEST);
    glDisable(GL_CULL_FACE);

    glDisableVertexAttribArray(vertexHandle);
    glDisableVertexAttribArray(normalHandle);
    glDisableVertexAttribArray(textureCoordHandle);

    QCAR::Renderer::getInstance().end();
    [self presentFramebuffer];
}

Thanks!!

That Guy

1 Answer


I haven't used Vuforia yet, so I don't fully understand your code, but I have successfully created an AR app that uses the gyroscope and compass to control the camera. Here is my camera matrix code:

GLKMatrix4 cameraMatrix = GLKMatrix4Identity;
cameraMatrix = GLKMatrix4Rotate(cameraMatrix, GLKMathDegreesToRadians((zenith-90)), 1.0f, 0.0f, 0.0f);
cameraMatrix = GLKMatrix4Rotate(cameraMatrix, GLKMathDegreesToRadians(azimuth), 0.0f, 1.0f, 0.0f);

Where zenith = 180 - gyroscope roll angle (0 points straight up, 180 points straight down), and azimuth = the compass heading (0 N, 90 E, 180 S, 270 W).
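
For what it's worth, I get those two angles from CoreMotion and the compass roughly like this (just a sketch; motionManager is a CMMotionManager and locationManager a CLLocationManager in my code, and your reference frame may differ):

CMDeviceMotion *motion = motionManager.deviceMotion;

// Roll comes back in radians; convert and flip so that 0 points straight up and 180 straight down.
float zenith = 180.0f - GLKMathRadiansToDegrees(motion.attitude.roll);

// The compass heading is already in degrees (0 N, 90 E, 180 S, 270 W).
float azimuth = locationManager.heading.trueHeading;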

And in the vertex shader, gl_Position is calculated by:

gl_Position = uProjectionMatrix * uCameraMatrix * vec4(position, 1.0);

I'm using point sprites, but position holds the coordinates of my particles in 3D world space. In your case, I'm guessing this is where you would substitute your other matrices like modelViewMatrix, etc.
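
For example, reusing your ShaderUtils helpers, I'd expect the combination to look roughly like this (untested sketch; cameraRotationMatrix is assumed to be a QCAR::Matrix44F that you rebuild from the gyro data each frame):

// modelViewProjection = projection * cameraRotation * modelView
QCAR::Matrix44F cameraAdjustedModelView;
ShaderUtils::multiplyMatrix(&cameraRotationMatrix.data[0], &modelViewMatrix.data[0], &cameraAdjustedModelView.data[0]);
ShaderUtils::multiplyMatrix(&qUtils.projectionMatrix.data[0], &cameraAdjustedModelView.data[0], &modelViewProjection.data[0]);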

Just be careful with your matrix multiplication orders!

Ricardo RendonCepeda
    Thanks for your suggestion. I wasn't able to get this to work (I'm no expert) and am getting additional help with this. If the solution is not too complex I'll post here. – That Guy Jan 06 '13 at 06:22