I'm currently working on something that would allow me to stream "what the headset is seeing" to another device's browser. I'm working with A-Frame (v1.2.0) experiences and an Oculus Quest. I used Socket.IO and WebRTC to establish the connection and send the stream between the participants, and everything works well up to this point.
I based my code on this example to stream the content of the canvas:
https://developers.google.com/web/updates/2016/10/capture-stream
Like this: document.querySelector("canvas").captureStream(30);
I then send this stream over WebRTC once Socket.IO has established the peer connection. Everything works fine: from the viewer's browser I can see the scene through the camera I defined:
<a-entity id="rig" position="0 1.6 5">
  <a-entity id="head" camera look-controls>
  </a-entity>
</a-entity>
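For reference, a simplified sketch of the capture-and-send step (not my exact code; it assumes `peerConnection` is the RTCPeerConnection already negotiated over Socket.IO, and `sendCanvasStream` is just a helper name I chose):

```javascript
// Simplified sketch: add every track of a captured canvas stream
// to an existing RTCPeerConnection so the remote viewer receives it.
function sendCanvasStream(stream, pc) {
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
}

// Browser wiring (assumes A-Frame has created its canvas and that
// `peerConnection` was negotiated over Socket.IO beforehand):
// const sceneEl = document.querySelector("a-scene");
// sceneEl.addEventListener("render-target-loaded", () => {
//   sendCanvasStream(sceneEl.canvas.captureStream(30), peerConnection);
// });
```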
The stream follows the camera well when it is moved by mouse-dragging in 2D mode (in a PC browser, or on the headset before entering VR mode).
But when the headset user enters VR mode, the streamed camera no longer follows the user's head movements (although I can still see my raycaster's laser move when I move my controller).
I ran some tests to understand this and found that "manually updating" the camera's rotation does work: I wrote a component that rotates the camera around its Y axis with the joystick, and that rotation shows up correctly in the stream as well, which proves I'm referencing the right camera. Does this mean the headset isn't updating my camera's rotation?
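The test component I describe is roughly the following (a sketch, not my exact code; the component name `joystick-rotate` is mine, and the `thumbstickmoved` event comes from A-Frame's oculus-touch-controls):

```javascript
// Pure helper: how many degrees of yaw to apply this frame for a given
// thumbstick deflection (axisX in [-1, 1]) and frame delta in milliseconds.
function yawDelta(axisX, degreesPerSecond, dtMs) {
  return axisX * degreesPerSecond * (dtMs / 1000);
}

// Sketch of the test component; only registered when A-Frame is present.
if (typeof AFRAME !== "undefined") {
  AFRAME.registerComponent("joystick-rotate", {
    schema: { speed: { default: 90 } }, // degrees/second at full deflection
    init() {
      this.axisX = 0;
      // `thumbstickmoved` is emitted by oculus-touch-controls.
      const controller = document.querySelector("[oculus-touch-controls]");
      controller.addEventListener("thumbstickmoved", (e) => {
        this.axisX = e.detail.x;
      });
    },
    tick(time, dtMs) {
      this.el.object3D.rotation.y -=
        THREE.MathUtils.degToRad(yawDelta(this.axisX, this.data.speed, dtMs));
    },
  });
}
```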
This camera has the core look-controls component attached:
https://aframe.io/docs/1.2.0/components/look-controls.html
Reading the official documentation, I saw:
Rotates the entity when we rotate a VR head-mounted display (HMD).
This seems to describe exactly what I need, but it doesn't work for me. I've tried almost every property of this component, such as forcing/verifying that the "hmdEnabled" property is set to true, and I can't make it work.
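For example, here is roughly how I forced the property (a sketch; `forceHmd` is just a helper name, not an A-Frame API, and `#head` matches my rig markup above):

```javascript
// Sketch of the hmdEnabled check described above.
function forceHmd(el) {
  // A-Frame's three-argument setAttribute updates a single component property.
  el.setAttribute("look-controls", "hmdEnabled", true);
  // Return the resolved component data so it can be inspected.
  return el.getAttribute("look-controls");
}

// Browser usage, with the rig markup above:
// forceHmd(document.querySelector("#head"));
// hmdEnabled reads back as true, yet the stream still doesn't follow the HMD.
```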