I'm experimenting with ARKit and I'm trying to place some models around the user. What I want is that when the app starts, it places some models around the user so he has to find them.

When he moves, say 10 meters, I want to add some random models again. I thought I could do it this way:

let cameraTransform = self.sceneView.session.currentFrame?.camera.transform
let cameraCoordinates = MDLTransform(matrix: cameraTransform!)

let camX = CGFloat(cameraCoordinates.translation.x)
let camY = CGFloat(cameraCoordinates.translation.y)
let cameraPosition = CGPoint(x: camX, y: camY)
let anchors = self.sceneView.hitTest(cameraPosition, types: [.featurePoint, .estimatedHorizontalPlane])

if let hit = anchors.first {
    let hitTransform = SCNMatrix4(hit.worldTransform)
    let hitPosition = SCNVector3Make(hitTransform.m41, hitTransform.m42, hitTransform.m43)
    self.sceneView.session.add(anchor: ARAnchor(transform: hit.worldTransform))
    return Coordinate(hitPosition.x, hitPosition.y, hitPosition.z)
}

return Coordinate(0, 0, 0)

The problem is that sometimes it doesn't find any anchors, and then I don't know what to do. And when it does find some anchors, the model is placed behind me, not in front of me. I don't know why, because I never turn the camera, so it shouldn't be able to find any anchors behind me.

Is there a better way to place random models in the real world?

user1007522

1 Answer

To do this, you'll need to use the session(_:didUpdate:) delegate method:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // At this point the camera is properly oriented in world coordinates.
    let cameraTransform = frame.camera.transform
    let cameraPosition = SCNVector3(
        cameraTransform.columns.3.x,
        cameraTransform.columns.3.y,
        cameraTransform.columns.3.z
    )
    // Now you have cameraPosition with x, y, z coordinates and you can
    // calculate the distance between two such points.

    // Pick a random point (in normalized image coordinates) for the hit test.
    let randomPoint = CGPoint(
        x: CGFloat(arc4random()) / CGFloat(UInt32.max),
        y: CGFloat(arc4random()) / CGFloat(UInt32.max)
    )
    guard let testResult = frame.hitTest(randomPoint, types: .featurePoint).first else { return }

    // Convert the 4x4 transform matrix into an x, y, z point.
    let objectPoint = SCNVector3(
        testResult.worldTransform.columns.3.x,
        testResult.worldTransform.columns.3.y,
        testResult.worldTransform.columns.3.z
    )
    // Do whatever you need with this object point.
}
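
For reference, here is a minimal sketch of how this might be wired up in a view controller (the sceneView outlet name and the sphere placeholder are assumptions; substitute your own model node):

import ARKit
import SceneKit
import UIKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Receive session(_:didUpdate:) callbacks for every new ARFrame.
        sceneView.session.delegate = self
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Random point in normalized image coordinates -- always inside the camera's view.
        let randomPoint = CGPoint(
            x: CGFloat(arc4random()) / CGFloat(UInt32.max),
            y: CGFloat(arc4random()) / CGFloat(UInt32.max)
        )
        guard let result = frame.hitTest(randomPoint, types: .featurePoint).first else { return }

        // In a real app you'd throttle this (for example with the 10 m check
        // described below) instead of adding a node on every frame.
        let node = SCNNode(geometry: SCNSphere(radius: 0.05)) // placeholder model
        node.position = SCNVector3(
            result.worldTransform.columns.3.x,
            result.worldTransform.columns.3.y,
            result.worldTransform.columns.3.z
        )
        sceneView.scene.rootNode.addChildNode(node)
    }
}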

Using session(_:didUpdate:) allows you to place an object whenever the camera position updates:

Implement this method if you provide your own display for rendering an AR experience. The provided ARFrame object contains the latest image captured from the device camera, which you can render as a scene background, as well as information about camera parameters and anchor transforms you can use for rendering virtual content on top of the camera image.

The really important part here is that you choose the point for the hitTest method randomly, and that this point will always be in front of the camera.

Don't forget to use the normalized 0 to 1.0 coordinate system for the CGPoint in the hitTest method:

A point in normalized image coordinate space. (The point (0,0) represents the top left corner of the image, and the point (1,1) represents the bottom right corner.)

If you want to place an object every 10 meters, you can save the camera position (in the session(_:didUpdate:) method) and check whether the x and z coordinates have changed by a large enough distance before placing a new object.
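
A rough sketch of that check, kept on the same view controller (the lastSpawnPosition property name and the 10 m threshold are just for illustration):

// Camera position the last time a model was placed; nil until the first one.
var lastSpawnPosition: SCNVector3?

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let cameraPosition = SCNVector3(
        frame.camera.transform.columns.3.x,
        frame.camera.transform.columns.3.y,
        frame.camera.transform.columns.3.z
    )

    if let last = lastSpawnPosition {
        // Horizontal (x/z) distance travelled since the last spawn.
        let dx = cameraPosition.x - last.x
        let dz = cameraPosition.z - last.z
        if (dx * dx + dz * dz).squareRoot() < 10 { return } // moved less than 10 m: do nothing
    }

    lastSpawnPosition = cameraPosition
    // ... run the random hitTest from above and place the next model here ...
}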

NOTE:

I'm assuming that you're using a world tracking session:

let configuration = ARWorldTrackingConfiguration()
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

Vasilii Muravev
  • Thanks for your explanation. I have some questions. The point for hittesting in your example code the arcRandom should be some number between 0 and 1 you say so the point is always in front of the camera? And then the x+z coordinates to see if the user has moved enough. With what coordinates do I need to compare these? Is that the session.currentFrame?.camera.transform x and z? – user1007522 Jul 19 '17 at 17:48
  • @user1007522 Yes, `frame: ARFrame` from function parameters is always in front of camera. For comparing distance, use `cameraPosition` (see updated answer ^). – Vasilii Muravev Jul 19 '17 at 18:12
  • Hey @Vasilii, thank you so much for your detailed answers! I and my team stuck on an issue for days ah :/ I posted an interesting question: https://stackoverflow.com/questions/63662318/arkit-apply-filter-cifilter-to-a-specific-part-vertex-of-an-arfaceanchor Would love your suggestions! – Roi Mulia Aug 31 '20 at 07:30