Reality Composer is just an app for simple, quick prototyping. Tasks beyond Reality Composer's scope are handled by RealityKit, which lets you programmatically change or set not only simple transform parameters (like a model's position, rotation, and scale) but also all the other necessary options. Here's a list of what you can change in a Reality Composer scene loaded into RealityKit:
- Compose hierarchical structures
- Set anchoring types and components
- Configure collision settings
- Change geometry shapes
- Assign shaders and materials
- Use raycasting methods
- Tune physics components and options
- Implement audio and video
- Load scenes synchronously or asynchronously
- Play animations
- Name entities of all types
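As a minimal sketch of several items from this list, here is how you might load a Reality Composer scene in RealityKit and then adjust an entity's transform, material, collision, and physics in code. It assumes an Xcode project containing a Reality Composer file named `Experience.rcproject` with a scene called "Box" (Xcode auto-generates the `Experience.loadBox()` loader for it); the entity name "Cube" is hypothetical.

```swift
import RealityKit
import UIKit

// Assumes `Experience.rcproject` with a "Box" scene exists in the project;
// Xcode generates the `Experience.loadBox()` method from it.
func setUpScene(in arView: ARView) {
    guard let boxScene = try? Experience.loadBox() else { return }
    arView.scene.anchors.append(boxScene)

    // Look an entity up by the name you gave it in Reality Composer
    // ("Cube" here is a hypothetical name).
    if let cube = boxScene.findEntity(named: "Cube") as? ModelEntity {
        // Transform: move the model 0.5 m up and scale it down.
        cube.position.y += 0.5
        cube.scale = [0.5, 0.5, 0.5]

        // Material: replace whatever was authored in Reality Composer.
        cube.model?.materials = [SimpleMaterial(color: .red, isMetallic: true)]

        // Collision and physics, generated from the model's own shape.
        cube.generateCollisionShapes(recursive: true)
        cube.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                                material: .default,
                                                mode: .dynamic)

        // Play any animations bundled with the entity.
        cube.availableAnimations.forEach { cube.playAnimation($0) }
    }
}
```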
Moreover, using RealityKit you can programmatically create lights, cameras, and transform animations, implement gestures for models and custom physics, and use anchor types that Reality Composer doesn't support at the moment (like `.body`, `.camera`, or `.raycastResult`), among other useful stuff.
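A short sketch of some of those RealityKit-only features: anchor types that Reality Composer's UI can't author, a programmatic light, and standard gestures installed on a model. The light intensity, sphere, and entity layout here are all illustrative choices, not values from the original scene.

```swift
import RealityKit

func addProgrammaticContent(to arView: ARView) {
    // Body anchor: tracks a person detected by ARKit's body tracking.
    let bodyAnchor = AnchorEntity(.body)

    // Camera anchor: keeps its children pinned relative to the camera.
    let cameraAnchor = AnchorEntity(.camera)

    // A point light, which Reality Composer currently can't create.
    let light = PointLight()
    light.light.intensity = 10_000  // illustrative value, in lumens
    cameraAnchor.addChild(light)

    arView.scene.anchors.append(bodyAnchor)
    arView.scene.anchors.append(cameraAnchor)

    // Gestures: translate/rotate/scale a model entity with touch.
    // A collision shape is required before gestures can be installed.
    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1),
                             materials: [SimpleMaterial()])
    sphere.generateCollisionShapes(recursive: true)
    cameraAnchor.addChild(sphere)
    arView.installGestures([.translation, .rotation, .scale], for: sphere)
}
```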
There are many things a prototyping app simply shouldn't be expected to do, especially when it comes to loading methodology, hierarchy customization, and rendering. However, I believe Reality Composer will eventually gain a UI for animation and physics.
As we all know, Apple never publishes roadmaps, but many developers are impatiently waiting for the next major updates to RealityKit, Reality Composer, and Reality Converter.