
I am trying to use a SceneKit shader modifier to re-project the texture coordinates of a mesh in SceneKit on iOS from an arbitrary camera in the scene that is parented to the geometry.

guard let faceGeometry = ARSCNFaceGeometry(device: device, fillMesh: true) else { return }
let faceNode = SCNNode()
faceNode.geometry = faceGeometry
sceneRoot.addChildNode(faceNode)

let textureCameraNode = SCNNode()
textureCameraNode.camera = SCNCamera()
textureCameraNode.position = SCNVector3(1, 1, 1)
textureCameraNode.look(at: SCNVector3(0, 0, 0))
faceNode.addChildNode(textureCameraNode)
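
For context, the matrices a re-projection would need from this camera can be gathered on the CPU side. A minimal sketch, assuming the node hierarchy above; the argument names `textureViewTransform` / `textureProjectionTransform` are my own invention, not SceneKit's:

```swift
import SceneKit

// Sketch: collect the texture camera's view and projection matrices so a
// shader modifier could re-project with them. Because textureCameraNode is
// a child of faceNode, inverting its local transform maps geometry
// (face-node) space into the texture camera's space.
let viewMatrix = SCNMatrix4Invert(textureCameraNode.transform)
let projectionMatrix = textureCameraNode.camera!.projectionTransform

// Hypothetical argument names; matrices are passed to shader modifiers
// via key-value coding, wrapped in NSValue.
faceGeometry.setValue(NSValue(scnMatrix4: viewMatrix),
                      forKey: "textureViewTransform")
faceGeometry.setValue(NSValue(scnMatrix4: projectionMatrix),
                      forKey: "textureProjectionTransform")
```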

Apple gives an example that projects the video feed as a texture onto the face, using the device screen's view (the main scene camera's viewport camera) as the texture-projection camera, via a vertex shader modifier at the geometry entry point, like so:

guard let shaderURL = Bundle.main.url(forResource: "VideoTexturedFace", withExtension: "shader"),
let modifier = try? String(contentsOf: shaderURL) else { fatalError("Can't load shader modifier from bundle.") }
faceNode.geometry?.shaderModifiers = [.geometry: modifier]

let affineTransform = frame.displayTransform(for: .portrait, viewportSize: viewController.ARView!.bounds.size)
let transform = SCNMatrix4(affineTransform)
faceNode.geometry?.setValue(SCNMatrix4Invert(transform), forKey: "displayTransform")
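
This snippet needs to run each frame so the transform tracks orientation changes; a common place for it is the SCNSceneRendererDelegate callback. A sketch, assuming a `sceneView` property and the `SCNMatrix4(CGAffineTransform)` convenience extension from Apple's sample:

```swift
// Sketch: refresh the displayTransform shader argument once per rendered frame.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard let frame = sceneView.session.currentFrame else { return }
    let affineTransform = frame.displayTransform(for: .portrait,
                                                 viewportSize: sceneView.bounds.size)
    let transform = SCNMatrix4(affineTransform) // extension from Apple's sample
    faceNode.geometry?.setValue(SCNMatrix4Invert(transform),
                                forKey: "displayTransform")
}
```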

The shader modifier itself looks like this:

// VideoTexturedFace.shader //

#pragma arguments
float4x4 displayTransform //from ARFrame.displayTransform(for:viewportSize:)

#pragma body

// Transform the vertex to the camera coordinate system.
float4 vertexCamera = scn_node.modelViewTransform * _geometry.position;

// Camera projection and perspective divide to get normalized viewport coordinates (clip space).
float4 vertexClipSpace = scn_frame.projectionTransform * vertexCamera;
vertexClipSpace /= vertexClipSpace.w;

// XY in clip space is [-1,1]x[-1,1], so adjust to UV texture coordinates: [0,1]x[0,1].
// Image coordinates are Y-flipped (upper-left origin).
float4 vertexImageSpace = float4(vertexClipSpace.xy * 0.5 + 0.5, 0.0, 1.0);
vertexImageSpace.y = 1.0 - vertexImageSpace.y;

// Apply ARKit's display transform (device orientation * front-facing camera flip).
float4 transformedVertex = displayTransform * vertexImageSpace;

// Output as texture coordinates for use in later rendering stages.
_geometry.texcoords[0] = transformedVertex.xy;

How can I use my texture camera in place of the viewport camera in the example above? Could someone help me with the shader/Swift code needed to accomplish this?
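
For clarity, here is an untested sketch of what I imagine the modified shader could look like; `textureViewTransform` and `textureProjectionTransform` are names I've made up for matrices that would be set from Swift with `setValue(_:forKey:)`:

```metal
#pragma arguments
float4x4 textureViewTransform        // geometry (face-node) space -> texture-camera space
float4x4 textureProjectionTransform  // texture-camera space -> clip space

#pragma body

// Transform the vertex into the texture camera's coordinate system
// instead of the viewport camera's.
float4 vertexCamera = textureViewTransform * _geometry.position;

// Project and perspective-divide, as in Apple's example.
float4 vertexClipSpace = textureProjectionTransform * vertexCamera;
vertexClipSpace /= vertexClipSpace.w;

// Map clip-space XY from [-1,1] to UV [0,1], flipping Y.
float2 uv = vertexClipSpace.xy * 0.5 + 0.5;
uv.y = 1.0 - uv.y;
_geometry.texcoords[0] = uv;
```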

Geoff H
  • It's not clear to me what you're asking for. Do you have an example or drawing of what you want to achieve? – mnuages Aug 06 '19 at 09:36

0 Answers