
I am developing an Android application that uses Google's ARCore in Xamarin. So far I have managed to do the following:

  1. I created a GLSurfaceView and I am able to see the output of ARCore (rendered with OpenGL) in it.
  2. I am using OpenVidu as the WebRTC media server, so I ported a Java library that uses google.webrtc to create a WebRTC connection, and I am able to stream my front camera feed to the server and also to screencast my Android screen to it (a sketch of this setup follows the list).
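For context, here is a minimal sketch of the camera-streaming path. It assumes the ported bindings mirror the org.webrtc Java API with Xamarin's PascalCase naming; the `Org.Webrtc` namespace, `EglBase.IContext`, and the `CapturerObserver` property are assumptions about what the port exposes, while the class and method names themselves come from the Java library:

```csharp
using Android.Content;
using Org.Webrtc; // assumed namespace of the ported org.webrtc binding

public static class CameraStreaming
{
    // Creates a VideoTrack backed by the front camera, following the usual
    // org.webrtc flow: enumerate cameras -> capturer -> source -> track.
    public static VideoTrack CreateFrontCameraTrack(
        Context context, PeerConnectionFactory factory, EglBase.IContext eglContext)
    {
        var enumerator = new Camera2Enumerator(context);
        VideoCapturer capturer = null;
        foreach (var name in enumerator.GetDeviceNames())
        {
            if (enumerator.IsFrontFacing(name))
            {
                capturer = enumerator.CreateCapturer(name, null);
                break;
            }
        }

        // The SurfaceTextureHelper gives the capturer its own GL thread/surface.
        var helper = SurfaceTextureHelper.Create("CameraCaptureThread", eglContext);
        var source = factory.CreateVideoSource(false /* isScreencast */);
        capturer.Initialize(helper, context, source.CapturerObserver);
        capturer.StartCapture(1280, 720, 30);

        return factory.CreateVideoTrack("camera0", source);
    }
}
```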

So I am able to cast the entire device screen to the server, but the problem is that this includes every notification and anything else the device displays (messages, calls, and so on). What I want is to cast only the GLSurfaceView to the server, and I don't know how to do that.

To display the ARCore output on the GLSurfaceView, I use OpenGL in the OnDrawFrame(IGL10) method of the Activity.
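A sketch of what that render callback looks like; the ARCore names (`Session.SetCameraTextureName`, `Session.Update`, `Frame`) follow the ARCore Java API as the Xamarin binding exposes it, and `BackgroundRenderer` is a hypothetical stand-in for whatever class renders the camera image:

```csharp
using Android.Opengl;                      // GLES20
using Javax.Microedition.Khronos.Opengles; // IGL10
using Google.AR.Core;                      // Session, Frame (Xamarin ARCore binding)

public partial class ArActivity // implements GLSurfaceView.IRenderer
{
    private Session _session;
    private BackgroundRenderer _backgroundRenderer; // hypothetical camera-image renderer

    // Runs on the GL thread once per frame.
    public void OnDrawFrame(IGL10 gl)
    {
        GLES20.GlClear(GLES20.GlColorBufferBit | GLES20.GlDepthBufferBit);
        if (_session == null)
            return;

        // Hand ARCore the texture it should write the camera image into,
        // then advance the session to get the current frame.
        _session.SetCameraTextureName(_backgroundRenderer.TextureId);
        Frame frame = _session.Update();

        // Draw the camera background; virtual content is rendered after this.
        _backgroundRenderer.Draw(frame);
    }
}
```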

To screencast, I use MediaProjection and add a new VideoSource, wrapped in a VideoTrack, to the PeerConnection.
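Sketched under the same assumption that the port mirrors the Java names: `ScreenCapturerAndroid` wraps the Intent returned by the MediaProjection permission dialog, and the resulting track is what currently carries the whole screen:

```csharp
using Android.Content;
using Android.Media.Projection;
using Org.Webrtc; // assumed namespace of the ported binding

public static class ScreenStreaming
{
    // projectionResult is the Intent delivered to OnActivityResult after the
    // user accepts the MediaProjection permission dialog.
    public static VideoTrack CreateScreencastTrack(
        Context context, PeerConnectionFactory factory,
        EglBase.IContext eglContext, Intent projectionResult)
    {
        var capturer = new ScreenCapturerAndroid(projectionResult, new StopCallback());

        var helper = SurfaceTextureHelper.Create("ScreenCaptureThread", eglContext);
        var source = factory.CreateVideoSource(true /* isScreencast */);
        capturer.Initialize(helper, context, source.CapturerObserver);
        capturer.StartCapture(1080, 1920, 30);

        // This track is then added to the PeerConnection (AddTrack in the Java API).
        return factory.CreateVideoTrack("screen0", source);
    }

    // Invoked when the user revokes the screen projection.
    private class StopCallback : MediaProjection.Callback
    {
        public override void OnStop() { /* stop the capturer / dispose the track here */ }
    }
}
```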

How can I add the output of ARCore (a buffer or anything else) as the VideoSource of the PeerConnection?

Or is there a better way of doing this?

Thanks

Luka
