I am trying to build an application like Twitch (i.e. many-to-many real-time video streams). I want to use WebRTC because I want to make the app accessible from all platforms (I am planning to go either with NativeScript or the PWA route). My plan is to stream the camera from person A to a media server, transcode the WebRTC stream into multiple qualities, and send it to all subscribed users, who are also able to play WebRTC streams. In the ideal case, there will be thousands of streamers, each with thousands of real-time subscribers.
How can this be done, though? I need some kind of media server that is responsible for taking the stream, transcoding it, and forwarding it. An MVP would just forward the stream without transcoding it, but it should be possible to add transcoding as an optimization later.
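For context, the MVP's "forward without transcoding" design is what's usually called an SFU (Selective Forwarding Unit): the server receives one encoded stream from the publisher and relays the packets to each subscriber unchanged, with no decode/re-encode step. A toy sketch of just that fan-out logic (all names here are illustrative, not any real media server's API):

```javascript
// Toy SFU fan-out: one publisher's packets are relayed verbatim
// to every subscriber in the room; nothing is transcoded.
class Room {
  constructor() {
    // subscriber id -> callback that delivers a packet to that subscriber
    this.subscribers = new Map();
  }

  subscribe(id, deliver) {
    this.subscribers.set(id, deliver);
  }

  unsubscribe(id) {
    this.subscribers.delete(id);
  }

  // Called for every packet arriving from the publisher.
  publish(packet) {
    for (const deliver of this.subscribers.values()) {
      deliver(packet); // forwarded as-is: no transcode step
    }
  }
}
```

A real SFU (e.g. mediasoup, Janus, Jitsi Videobridge) additionally handles ICE, DTLS-SRTP, congestion control, and retransmission. Note that multiple qualities don't necessarily require server-side transcoding either: with simulcast the publisher sends several encodings at once and the SFU simply picks one per subscriber.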
Should I go with something like Kurento, Jitsi, etc., or is it feasible to build this server myself?
Is this architecture even a good idea, or should I rethink everything? The reason I am not going with RTMP or something similar is the amount of work that goes into developing separate client code for each platform (iOS, Android, every browser out there). If I can use WebRTC, the client code becomes much simpler and the app will be accessible on all platforms.
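To illustrate how small the publishing side gets with WebRTC in a browser/PWA: the sketch below captures the camera and offers it to a media server. The browser APIs (`getUserMedia`, `RTCPeerConnection`) are standard; the HTTP signaling endpoint and the server answering with an SDP answer are assumptions on my part, since every media server does signaling differently.

```javascript
// Browser-only sketch: publish the local camera to a media server.
// The signaling URL and its request/response format are hypothetical.
async function publishCamera(signalingUrl) {
  // Ask the user for camera + microphone access.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  const pc = new RTCPeerConnection();
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream); // send each captured track to the server
  }

  // Create an SDP offer and send it to the (assumed) signaling endpoint.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const res = await fetch(signalingUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(pc.localDescription),
  });

  // Apply the server's SDP answer to complete the handshake.
  await pc.setRemoteDescription(await res.json());
  return pc;
}
```

Roughly the same handful of calls works on iOS and Android through the native WebRTC libraries (or a NativeScript wrapper around them), which is exactly the cross-platform win described above.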
Thanks a lot in advance!