I've just started developing for the Windows Mixed Reality headset in Unity, and it seems to be going well until I build the project.
The idea of my game is simple: one player navigates through a maze in VR while another watches the monitor and guides them through.
In the Unity editor, under the Game tab, the cameras work as expected. I used RenderTextures to display two cameras (one showing the VR view and one showing an overview of the entire maze) on a canvas, which makes up the Game view.
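For reference, here is roughly what that setup looks like expressed as a script. In my project the wiring is actually done in the Inspector, so the class and field names below are just placeholders:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Rough sketch of my setup. In the actual project the cameras, RawImages and
// RenderTextures are assigned in the Inspector; the names here are placeholders.
public class MazeViewSetup : MonoBehaviour
{
    public Camera vrCamera;        // camera on the VR rig
    public Camera overviewCamera;  // top-down camera showing the whole maze
    public RawImage vrView;        // RawImage on the canvas showing the VR feed
    public RawImage overviewView;  // RawImage on the canvas showing the overview

    void Start()
    {
        // Each camera renders into its own RenderTexture,
        // which is then displayed on the canvas through a RawImage.
        var vrTexture = new RenderTexture(1024, 1024, 24);
        var overviewTexture = new RenderTexture(1024, 1024, 24);

        vrCamera.targetTexture = vrTexture;
        overviewCamera.targetTexture = overviewTexture;

        vrView.texture = vrTexture;
        overviewView.texture = overviewTexture;
    }
}
```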
However, when I build the game, the only thing that appears on the monitor is the VR perspective.
I have set the Target Eye for the VR camera to "Both" and for the main camera to "None (Main Display)", as others have suggested, but no luck.
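In script form, what I have tried corresponds roughly to this (again, in the project these values are set in the Inspector, and the field names are placeholders):

```csharp
using UnityEngine;

// Sketch of the camera settings I've tried; set in the Inspector in the real project.
public class CameraEyeSettings : MonoBehaviour
{
    public Camera vrCamera;    // the headset camera
    public Camera mainCamera;  // the camera that should fill the monitor

    void Awake()
    {
        // Target Eye = "Both": the VR camera renders to both eyes of the HMD.
        vrCamera.stereoTargetEye = StereoTargetEyeMask.Both;

        // Target Eye = "None (Main Display)": this camera is not sent to the HMD
        // and should only render to the main display.
        mainCamera.stereoTargetEye = StereoTargetEyeMask.None;
    }
}
```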
Is this a small error I've overlooked, or is there a larger problem?
Alex.