
I have an iOS point cloud app and am trying to allow the user to keep adding points to an existing point cloud across multiple sessions.

I have a Metal MTKView overlaid on, and aligned with, an ARView underneath. With .showSceneUnderstanding enabled, I can verify that the point cloud in the Metal view matches the mesh in the ARView.
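For context, the overlay is set up roughly like this (a sketch; `metalView` is my MTKView and both views share the same bounds):

metalView.frame = arView.bounds
metalView.isOpaque = false       // let the ARView's camera feed show through
metalView.layer.isOpaque = false
metalView.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
arView.addSubview(metalView)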

I can restore the world state of the ARView by archiving and unarchiving an ARWorldMap and assigning it to initialWorldMap. After some device rotation, the scene matches the anchors from the ARWorldMap and displays the appropriate scene understanding. I also have a point cloud, defined by XYZ coordinates.
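The archiving itself is the standard NSKeyedArchiver round trip (a sketch; file URL handling is elided):

func saveWorldMap(_ map: ARWorldMap, to url: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                requiringSecureCoding: true)
    try data.write(to: url, options: .atomic)
}

func loadWorldMap(from url: URL) throws -> ARWorldMap? {
    let data = try Data(contentsOf: url)
    return try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
}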

How can I get the camera within my MTKView to correspond to the current camera orientation within ARView?

In other words: I can restore the existing world map and have it displayed, but the ARView now shows the scene from a different, non-original viewport and camera position. When I restore the point cloud, its origin is set to the current device position/orientation, so the cloud loses alignment with the real world.

How can I take the ARView/ARWorldMap/ARFrame camera position and orient my point cloud so that the viewport within my MTKView correctly displays the part of the point cloud corresponding to the current ARView viewport?
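One direction I'm exploring, in case it clarifies the question: persist a named ARAnchor alongside the world map, store the cloud in that anchor's local space, and re-apply the anchor's transform after relocalization. A sketch (the anchor name and the idea of folding the transform into localToWorld are my assumptions):

// At capture time: add a named anchor and record points relative to it
let originAnchor = ARAnchor(name: "pointCloudOrigin", transform: frame.camera.transform)
arView.session.add(anchor: originAnchor)
let worldToAnchor = originAnchor.transform.inverse
// localPoint = worldToAnchor * simd_float4(worldPoint, 1)

// After relocalization: the anchor comes back with the restored map
if let restored = arView.session.currentFrame?.anchors
        .first(where: { $0.name == "pointCloudOrigin" }) {
    let anchorToWorld = restored.transform
    // worldPoint = anchorToWorld * simd_float4(localPoint, 1)
    // (or fold anchorToWorld into the uniforms before rendering)
}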

For reference, the debug overlay is enabled with:

arView.debugOptions.insert(.showSceneUnderstanding)

func preloadWorld(for configuration: ARWorldTrackingConfiguration) {
    // GeometryManager is my own persistence wrapper that unarchives the saved ARWorldMap
    guard let restoredMap: ARWorldMap = GeometryManager<ARWorldMap>().value else {
        return
    }
    configuration.initialWorldMap = restoredMap
}
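This is invoked when (re)starting the session, roughly like so (a sketch; sceneReconstruction assumes a LiDAR device):

let configuration = ARWorldTrackingConfiguration()
configuration.sceneReconstruction = .mesh
preloadWorld(for: configuration)
arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])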

Here's the code which captures the camera position:

static func create(with frame: ARFrame, viewportSize: CGSize) -> PointCloudUniforms {
    var uniforms = PointCloudUniforms.defaults

    uniforms.viewportWidth  = Float(viewportSize.width)
    uniforms.viewportHeight = Float(viewportSize.height)

    let screenOrientation = UIInterfaceOrientation.portrait // TODO: pass in the actual interface orientation

    let camera = frame.camera
    let cameraIntrinsicsInversed = camera.intrinsics.inverse
    // World-to-camera transform for the current frame
    let viewMatrix = camera.viewMatrix(for: screenOrientation)
    let viewMatrixInversed = viewMatrix.inverse
    let projectionMatrix = camera.projectionMatrix(for: screenOrientation,
                                                   viewportSize: viewportSize,
                                                   zNear: 0.01,
                                                   zFar: 100.0)

    uniforms.viewProjectionMatrix = projectionMatrix * viewMatrix
    // Camera-to-world, corrected via my UIInterfaceOrientation extension
    uniforms.localToWorld = viewMatrixInversed * screenOrientation.rotationToARCameraMatrix
    uniforms.cameraIntrinsicsInversed = cameraIntrinsicsInversed

    uniforms.cameraResolution = Float2(Float(camera.imageResolution.width),
                                       Float(camera.imageResolution.height))

    return uniforms
}
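This is called once per frame from the MTKView draw loop (a sketch; `updateUniformsBuffer` is a placeholder for my buffer upload, and create is assumed to live on PointCloudUniforms):

func draw(in view: MTKView) {
    guard let currentFrame = arView.session.currentFrame else { return }
    let uniforms = PointCloudUniforms.create(with: currentFrame,
                                             viewportSize: view.drawableSize)
    updateUniformsBuffer(with: uniforms) // copy into the MTLBuffer bound at kPointCloudUniforms
    // ... encode and submit the render pass ...
}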

Here's the vertex shader that projects each point of the cloud using the current camera's view-projection matrix:

/// Renders an individual point of the point cloud
vertex ParticleVertexOut particleVertex(uint vertexID [[vertex_id]],
                                        constant PointCloudUniforms &uniforms [[buffer(kPointCloudUniforms)]],
                                        constant ParticleUniforms *particleUniforms [[buffer(kParticleUniforms)]]) {
    // get point data
    const auto particleData = particleUniforms[vertexID];
    const auto position = particleData.position;
    
    // project the point into clip space
    float4 projectedPosition = uniforms.viewProjectionMatrix * float4(position, 1.0);

    // explicit perspective divide (the rasterizer's own divide then becomes a no-op)
    projectedPosition /= projectedPosition.w;
    
    ParticleVertexOut out;
    out.position = projectedPosition;
    
    out.color = float4(particleData.color, 1);
    return out;
}
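For completeness, the Swift side binds buffers at the indices the shader expects, roughly like this (a sketch; the pipeline state, buffer, and count names are mine):

encoder.setRenderPipelineState(particlePipelineState)
encoder.setVertexBytes(&uniforms,
                       length: MemoryLayout<PointCloudUniforms>.stride,
                       index: Int(kPointCloudUniforms.rawValue))
encoder.setVertexBuffer(particlesBuffer, offset: 0,
                        index: Int(kParticleUniforms.rawValue))
encoder.drawPrimitives(type: .point, vertexStart: 0, vertexCount: currentPointCount)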

See the example below of aligning a teapot with its mesh from ARView. [screenshot of the aligned teapot]

