
I'm working on an application that recognizes an image and then places an AR node (a 3D model built with Reality Composer) on top of it. I want to build this with RealityKit/Reality Composer (which should also support image recognition), but it does not work.

I've already tested whether the model works on a simple horizontal plane, and it does (both in Xcode and in the Reality Composer test environment). But when I select an image as the anchoring mode, the model does not appear in the Xcode project, although it does appear in the Reality Composer test environment.

I currently use this code to load the Reality Composer project into Xcode:

// Run a world-tracking session with horizontal plane detection
let arConfiguration = ARWorldTrackingConfiguration()
arConfiguration.planeDetection = .horizontal
arView.session.run(arConfiguration)

// Load the scene from the Reality Composer project and add it to the view
guard let anchor = try? Spinner.loadScene() else { return }
arView.scene.anchors.append(anchor)

The expected result is that the model appears when the camera is pointed at the correct image.
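For reference, a minimal sketch of how an image-anchored Reality Composer scene is typically loaded (assuming the scene loader is `Spinner.loadScene()` as above; the view controller setup shown here is illustrative). When the scene uses an image anchor, running a world-tracking configuration with plane detection manually may interfere, since `ARView` can configure the session for you:

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Let ARView pick the session configuration that matches the
        // scene's anchoring (image tracking, in this case) instead of
        // running an ARWorldTrackingConfiguration manually.
        arView.automaticallyConfigureSession = true

        // Load the image-anchored scene from the Reality Composer project.
        guard let anchor = try? Spinner.loadScene() else { return }
        arView.scene.anchors.append(anchor)
    }
}
```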

Andy Jazz
MarcWiggerman

2 Answers


I had the same issue on the iOS 13 beta. Updating to the iOS 13.1 beta did the trick. I can only guess it is something related to RealityKit on iOS. Note that updating to the iOS 13.1 beta also requires Xcode 11 beta 7 to support it. Hope this helps.

Yonatan Vainer

I faced the same issue, and a Google search brought me here. I'm using Xcode 11.2 and iOS 13.1.2. I resolved the issue by adding a simple capsule object to the canvas in Reality Composer. I'm writing my use case here in case it helps someone.
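The workaround above adds a visible object in the Reality Composer canvas; a similar sanity check can be done in code. A minimal sketch (assuming `anchor` is the entity loaded via `Spinner.loadScene()`; the sphere is a hypothetical debug marker, not part of the original project) that attaches a plain sphere to the anchor so you can see whether the image anchor fires at all:

```swift
import RealityKit
import UIKit

// Assuming `anchor` is the anchor entity loaded from the
// Reality Composer project via Spinner.loadScene().
// Attach a small green sphere so the anchor's position becomes
// visible even if the original model fails to appear.
let debugSphere = ModelEntity(
    mesh: .generateSphere(radius: 0.03),
    materials: [SimpleMaterial(color: .green, isMetallic: false)]
)
anchor.addChild(debugSphere)
```

If the sphere shows up on the recognized image but the original model does not, the anchor is working and the problem lies with the model itself.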

Usman