Questions tagged [arkit]

Apple ARKit is an Augmented Reality SDK introduced in iOS 11.0. It integrates an iOS or visionOS device's RGB camera data, motion sensor data, and LiDAR scanner data to produce robust AR experiences.

ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around it. VIO fuses RGB camera sensor data at 60 fps with motion sensor (IMU) data at 1000 fps. These two inputs allow a device running iOS or visionOS to sense how it moves within a room with a high degree of accuracy, and without any additional calibration. In an ARKit app, the user can create and render 3D scenes using the RealityKit, SceneKit, SpriteKit, and Metal frameworks.

The latest version of ARKit has the following features in its arsenal: People Occlusion with depth-channel semantics, LiDAR sensor support for a high-quality depth channel and better scene understanding, live Motion Capture that lets you animate a 3D skeleton, simultaneous front and rear camera tracking, hand tracking, the ability to track up to 3 faces with a TrueDepth camera, collaborative sessions among up to 6 users, Geo Tracking, and many other useful features.
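Most questions under this tag start from the same few lines of session setup. A minimal sketch, assuming a storyboard-connected ARSCNView outlet:

```swift
import UIKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking runs VIO (camera + IMU fusion) under the hood.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```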


3311 questions
10
votes
1 answer

How to precompile PBR shaders for SceneKit?

I've noticed that if you have an empty scene and then load a 3D model with physically based lighting into it, there is a small bit of jank as the object appears. If I then add a different object after that, the stuttering does not occur. Looking…
pushmatrix
  • 726
  • 1
  • 9
  • 23
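One commonly suggested mitigation is to let SceneKit compile the PBR shaders before the node becomes visible, via SCNSceneRenderer's prepare API. A minimal sketch, assuming the node and view already exist:

```swift
import SceneKit

// Warm up shader compilation and GPU resource upload off-screen,
// so the first physically based object doesn't cause a frame hitch.
func preload(_ node: SCNNode, into view: SCNView) {
    view.prepare([node]) { success in
        DispatchQueue.main.async {
            // Resources are compiled; adding the node should no longer jank.
            if success {
                view.scene?.rootNode.addChildNode(node)
            }
        }
    }
}
```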
10
votes
1 answer

How do you play a video with alpha channel using AVFoundation?

I have an AR application that uses SceneKit and imports a video onto the scene using AVPlayer, thereby adding it as a child node of an SKVideoNode. The video is visible as it is supposed to be, but the transparency in the video is not achieved. Code…
SWAT
  • 1,107
  • 8
  • 19
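For reference, one possible approach on iOS 13 and later is an HEVC-with-alpha encoded video rendered through an AVPlayer-backed material, letting SceneKit do the compositing. A sketch, where "transparent.mov" is a hypothetical bundled asset:

```swift
import AVFoundation
import SceneKit

// Plays an HEVC-with-alpha video on a plane; SceneKit accepts an AVPlayer
// directly as material contents.
func makeVideoPlane() -> SCNNode {
    let url = Bundle.main.url(forResource: "transparent", withExtension: "mov")!
    let player = AVPlayer(url: url)

    let plane = SCNPlane(width: 1.6, height: 0.9)
    let material = SCNMaterial()
    material.diffuse.contents = player
    material.isDoubleSided = true
    material.blendMode = .alpha   // respect the video's alpha channel
    plane.materials = [material]

    player.play()
    return SCNNode(geometry: plane)
}
```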
10
votes
3 answers

Apple Vision image recognition

As many other developers, I have plunged myself into Apple's new ARKit technology. It's great. For a specific project however, I would like to be able to recognise (real-life) images in the scene, to either project something on it (just like…
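Since ARKit 1.5 this can be done natively with detection images, no Vision required. A sketch, assuming an "AR Resources" asset-catalog group containing the reference images:

```swift
import ARKit

// Detected images are reported to the session delegate as ARImageAnchor,
// which you can use to project content onto the real-world image.
func runImageDetection(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if let references = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                         bundle: nil) {
        configuration.detectionImages = references
    }
    session.run(configuration)
}
```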
10
votes
7 answers

Xcode 9 GM - Export and Upload to App Store crashing

I'm trying to upload my app that uses ARKit (a Unity build) to iTunes Connect for TestFlight distribution. During both the export and the upload-to-App-Store process from Xcode -> Organizer, I see a crash at "Stripping extended attributes for…
Wojciech Rutkowski
  • 11,299
  • 2
  • 18
  • 22
10
votes
3 answers

ARKit and Vision frameworks for Object Recognition

I would really like some guidance on combining Apple's new Vision API with ARKit in a way that enables object recognition. This would not need to track the moving object, just recognize it stably in 3D space for the AR experience to react…
cnzac
  • 435
  • 3
  • 13
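A sketch of the usual pattern: run a Vision Core ML request against frame.capturedImage, then anchor AR content with a hit test at the point where the object was seen. The model is a placeholder you would supply:

```swift
import ARKit
import Vision

// Classifies the current camera frame; `model` wraps any image-classification
// Core ML model (a placeholder here, not a specific library model).
func classify(frame: ARFrame, model: VNCoreMLModel) {
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = request.results?.first as? VNClassificationObservation else { return }
        print("Saw \(top.identifier) (confidence \(top.confidence))")
        // To place content, hit-test the corresponding screen point
        // (e.g. sceneView.hitTest) and add an ARAnchor at the result.
    }
    // .right maps the landscape sensor buffer to portrait orientation.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        orientation: .right)
    try? handler.perform([request])
}
```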
10
votes
3 answers

Is it possible to run ARKit in simulator?

I tried to run the ARKit sample in the Xcode 9 iPhone 7 Plus simulator, but it failed. Is there any possibility of testing ARKit projects in iOS simulators?
Karthikeyan Bose
  • 1,244
  • 3
  • 17
  • 26
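ARKit world tracking relies on real camera and motion hardware (an A9 chip or later), so it does not run in the Simulator. A runtime guard sketch:

```swift
import ARKit

// Falls back gracefully when world tracking is unavailable,
// e.g. in the Simulator or on pre-A9 devices.
func startTracking(in session: ARSession) {
    if ARWorldTrackingConfiguration.isSupported {
        session.run(ARWorldTrackingConfiguration())
    } else {
        print("ARKit world tracking is not supported on this device/simulator.")
    }
}
```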
10
votes
2 answers

ARKit vs SceneKit coordinates

I'm trying to understand the difference between the different elements introduced in ARKit and their possible equivalents in SceneKit: SCNNode.simdTransform vs SCNNode.transform. In ARKit, it seems that people use SCNNode.simdTransform instead of…
Guig
  • 9,891
  • 7
  • 64
  • 126
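For orientation: both properties expose the same underlying node transform; ARKit's own types (ARAnchor.transform, ARCamera.transform) are simd_float4x4, which is why simdTransform is often preferred, as it avoids a matrix conversion. A small sketch:

```swift
import ARKit
import SceneKit

// Positions a SceneKit node at an ARKit anchor.
func place(_ node: SCNNode, at anchor: ARAnchor) {
    node.simdTransform = anchor.transform          // no SCNMatrix4 round-trip

    // Equivalent via the classic API:
    // node.transform = SCNMatrix4(anchor.transform)
}
```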
9
votes
1 answer

ARKit and RealityKit - ARSessionDelegate is retaining 14 ARFrames

I am classifying images per frame from the ARSession delegate using the Vision framework and Core ML in an Augmented Reality app built with ARKit and RealityKit. While processing a frame.capturedImage, I am not requesting another frame.capturedImage for…
Tanvirgeek
  • 540
  • 1
  • 9
  • 17
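A common mitigation sketch: ARKit recycles a small fixed pool of ARFrames, so retaining frames (or closures that capture them) starves the session and triggers this warning. Capture only the pixel buffer and drop frames while one is still being processed; the names here are illustrative:

```swift
import ARKit

final class FrameClassifier: NSObject, ARSessionDelegate {
    private var isProcessing = false   // delegate runs on the main thread

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Skip frames while classification is in flight instead of queueing
        // them; queued ARFrames would keep pool buffers alive.
        guard !isProcessing else { return }
        isProcessing = true

        // Capture only the pixel buffer, never the whole ARFrame.
        let pixelBuffer = frame.capturedImage
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            self?.classify(pixelBuffer)
            DispatchQueue.main.async { self?.isProcessing = false }
        }
    }

    private func classify(_ buffer: CVPixelBuffer) {
        // Vision / Core ML work goes here.
    }
}
```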
9
votes
0 answers

ARKit – Apply CIFilter to specific vertices of ARFaceAnchor

I'm trying to apply a CIFilter to a specific group of vertices from the ARFaceAnchor. I was able to access the vertices using: let vertices = faceAnchor.geometry.vertices let relevantVertices = [vertices[580], …
Roi Mulia
  • 5,626
  • 11
  • 54
  • 105
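A hedged sketch of one building block: project the selected face vertices into 2D with ARCamera.projectPoint, which yields the image region a CIFilter could then be applied over. The vertex indices are simply the ones quoted in the question:

```swift
import UIKit
import ARKit

// Maps face-local vertices to 2D view points for a given frame.
func imagePoints(for indices: [Int],
                 anchor: ARFaceAnchor,
                 frame: ARFrame,
                 viewportSize: CGSize,
                 orientation: UIInterfaceOrientation) -> [CGPoint] {
    let vertices = anchor.geometry.vertices
    return indices.map { i in
        // Face-local vertex -> world space.
        let world = anchor.transform * simd_float4(vertices[i], 1)
        // World space -> 2D point via the frame's camera.
        return frame.camera.projectPoint(simd_float3(world.x, world.y, world.z),
                                         orientation: orientation,
                                         viewportSize: viewportSize)
    }
}
```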
9
votes
1 answer

Save ARFaceGeometry to OBJ file

In an iOS ARKit app, I've been trying to save the ARFaceGeometry data to an OBJ file. I followed the explanation here: How to make a 3D model from AVDepthData?. However, the OBJ isn't created correctly. Here's what I have: func renderer(_…
Daniel McLean
  • 451
  • 7
  • 10
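A sketch of one known route: bridge the face geometry through ARSCNFaceGeometry into Model I/O and export with MDLAsset. Assumes a Metal-capable device and a writable URL ending in ".obj":

```swift
import ARKit
import ModelIO
import MetalKit
import SceneKit.ModelIO

// Converts the current face geometry to a mesh and writes an OBJ file.
func export(faceAnchor: ARFaceAnchor, to url: URL) throws {
    guard let device = MTLCreateSystemDefaultDevice(),
          let faceGeometry = ARSCNFaceGeometry(device: device) else { return }
    faceGeometry.update(from: faceAnchor.geometry)

    let mesh = MDLMesh(scnGeometry: faceGeometry)
    let asset = MDLAsset()
    asset.add(mesh)
    try asset.export(to: url)   // Model I/O infers OBJ from the extension
}
```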
9
votes
2 answers

How do I rotate an object around only one axis in RealityKit?

I'm trying to rotate a cube around its z-axis, but I can't figure out how. Is there a way in RealityKit to do this?
Robbe Verhoest
  • 401
  • 4
  • 9
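For reference, a minimal sketch: set the entity transform's rotation to a quaternion built from an angle and a single axis:

```swift
import RealityKit

// Rotates an entity around its z-axis; use (1,0,0) or (0,1,0) for x or y.
func spin(_ entity: Entity, byDegrees degrees: Float) {
    let radians = degrees * .pi / 180
    entity.transform.rotation = simd_quatf(angle: radians, axis: [0, 0, 1])
}

// Animated variant:
// var target = entity.transform
// target.rotation = simd_quatf(angle: .pi, axis: [0, 0, 1])
// entity.move(to: target, relativeTo: entity.parent, duration: 2)
```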
9
votes
2 answers

Transforming ARFrame#capturedImage to view size

When using the ARSessionDelegate to process the raw camera image in ARKit... func session(_ session: ARSession, didUpdate frame: ARFrame) { guard let currentFrame = session.currentFrame else { return } let capturedImage =…
Ralf Ebert
  • 3,556
  • 3
  • 29
  • 43
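The API designed for this is ARFrame.displayTransform(for:viewportSize:), which maps normalized image coordinates to normalized view coordinates, including the rotation and aspect-fill crop. A sketch:

```swift
import ARKit
import CoreImage

// Produces a CIImage matching what the AR view displays for this frame.
func displayImage(for frame: ARFrame,
                  viewportSize: CGSize,
                  orientation: UIInterfaceOrientation) -> CIImage {
    let buffer = frame.capturedImage
    let imageSize = CGSize(width: CVPixelBufferGetWidth(buffer),
                           height: CVPixelBufferGetHeight(buffer))
    let displayTransform = frame.displayTransform(for: orientation,
                                                  viewportSize: viewportSize)
    return CIImage(cvPixelBuffer: buffer)
        // to normalized image space
        .transformed(by: CGAffineTransform(scaleX: 1 / imageSize.width,
                                           y: 1 / imageSize.height))
        // to normalized view space
        .transformed(by: displayTransform)
        // to viewport pixels
        .transformed(by: CGAffineTransform(scaleX: viewportSize.width,
                                           y: viewportSize.height))
}
```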
9
votes
1 answer

How to improve People Occlusion in ARKit 3.0

We are working on a demo app using People Occlusion in ARKit. Because we want to add videos to the final scene, we use SCNPlanes to render the video, with an SCNBillboardConstraint to ensure they face the right way. These videos are also…
vrwim
  • 13,020
  • 13
  • 63
  • 118
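For context, People Occlusion is enabled through frame semantics on the configuration; a minimal sketch, with the caveat that the segmentation mask is estimated at a lower resolution than the camera image, which is the usual source of edge-quality complaints:

```swift
import ARKit

// Enables person occlusion with depth, so virtual content can sit
// behind people, when the device supports it.
func enableOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}

// For post-processing the mask yourself, each ARFrame exposes
// frame.segmentationBuffer and frame.estimatedDepthData.
```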
9
votes
1 answer

Convert ARFrame's captured image to UIImage orientation issue

I want to detect a ball and have an AR model interact with it. I used OpenCV for ball detection and send the center of the ball, which I can use in hitTest to get coordinates in the sceneView. I have been converting the CVPixelBuffer to a UIImage using the following…
Alok Subedi
  • 1,601
  • 14
  • 26
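A sketch of the usual conversion: capturedImage is delivered in the landscape sensor orientation, so a portrait app rotates it while converting through Core Image:

```swift
import ARKit
import UIKit

// Converts the frame's pixel buffer to a portrait-oriented UIImage.
// Reuse the CIContext in production; it is expensive to create per frame.
func uiImage(from frame: ARFrame) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
        .oriented(.right)   // adjust for the current interface orientation
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```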
9
votes
1 answer

Create a CMSampleBuffer from a CVPixelBuffer

I get a CVPixelBuffer from ARSessionDelegate: func session(_ session: ARSession, didUpdate frame: ARFrame) { frame.capturedImage // CVPixelBufferRef } But another part of my app (that I can't change) uses a CMSampleBuffer. CMSampleBuffer is a…
Shai Balassiano
  • 997
  • 2
  • 10
  • 21
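A sketch of the standard Core Media route: derive a format description from the pixel buffer, attach timing, and wrap it in a sample buffer. For ARKit, the presentation time would typically come from frame.timestamp:

```swift
import CoreMedia

// Wraps a CVPixelBuffer in a CMSampleBuffer with the given timestamp.
func sampleBuffer(from pixelBuffer: CVPixelBuffer,
                  presentationTime: CMTime) -> CMSampleBuffer? {
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: presentationTime,
                                    decodeTimeStamp: .invalid)
    var result: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &result)
    return result
}
```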