
This is really an initial fact-finding question. In the past we have been using Zoom to run our audio/video meetings (which are effectively 1-teacher : 1-student meetups). We have now developed our own in-house JavaScript client to do the same, and we have just tested a prototype.

In this test (one teacher, one student) everything seemed to go well initially. However we noted that: a) there appears to be more lag in the reception of audio and video tracks between sender and receiver than in the Zoom app; and b) after a short interval (5-10 minutes of session duration) one of the parties crashes, requiring the audio/video session to be set up again (getUserMedia, then the RTCPeerConnection setup with the onicecandidate, onnegotiationneeded and ontrack callbacks feeding into our signalling server).

Clearly, since this is at the prototype stage there will be errors requiring fixing, but I was wondering whether an app such as Zoom actually uses the same WebRTC protocols and functions as we do, or whether it goes through its own in-house libraries, perhaps to get around some of the quite complex pre-transmission issues such as codecs and compression of the multimedia data, and then to handle the UDP transmission of the data packets itself. Many thanks in advance to anyone with some ideas on this!
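For reference, a stripped-down sketch of our session setup is below. sendToSignallingServer() and the #remoteVideo element are stand-ins for our own code, and the handling of answers/candidates coming back from the signalling server is omitted:

```javascript
// Simplified sketch of our current setup (offer side only).
// sendToSignallingServer() is a placeholder for our own signalling code.
async function startSession() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] // example STUN server
  });

  // Trickle ICE: forward each candidate to the counterparty via the signalling server.
  pc.onicecandidate = ({ candidate }) => {
    if (candidate) sendToSignallingServer({ type: 'candidate', candidate });
  };

  // Fired when a (re)offer is needed, e.g. after tracks are added.
  pc.onnegotiationneeded = async () => {
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    sendToSignallingServer({ type: 'offer', sdp: pc.localDescription });
  };

  // Remote tracks arrive here; attach them to a <video> element.
  pc.ontrack = ({ streams: [remoteStream] }) => {
    document.querySelector('#remoteVideo').srcObject = remoteStream;
  };

  // Adding the local tracks triggers onnegotiationneeded.
  stream.getTracks().forEach(track => pc.addTrack(track, stream));
  return pc;
}
```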


Ivan: Many thanks for this. Yes, we know about WebAssembly – we actually did some work two years ago compressing .wav to .mp3 using a pre-built wasm module. Mobile devices are not relevant for our main app. What you say about restricted networks is useful too, but we think the main issue here is network speed.

We did a quick check with all users and they largely divide into two groups, with network speeds in the order of 3-14 Mbps and 100-120 kbps. In the main our users are students working at home, connecting either through a Wi-Fi router or via a mobile phone. The Wi-Fi router appears to consistently give the Mbps range, while those connected through mobile phones can equal the Mbps figure but, depending on the time of day or proximity to a data signal, can drop to 100-120 kbps or even lower.

We suspect that the users having problems sending video/audio are those whose available bandwidth drops below 100 kbps (we find that in this case the sender can still see and hear the received video, but their counterparty starts complaining about missing video frames – a black screen and then no audio). Interestingly, the Zoom app appears to handle this degraded case better, but we really don't know why.

This prompts me to ask whether WebRTC requires a minimum bandwidth and how it calculates the sender/receiver bitrate – I suppose calculating the latter would depend on the former as well as on the intrinsic properties of the webcam. Is there a way of doing some sort of configuring on-the-fly, taking network conditions into account, for this degraded case?

The reason for asking is that we would like to maintain the video chat during these degraded situations (albeit at a reduced bitrate/resolution/framerate).
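To make the question concrete, the kind of on-the-fly adjustment we have in mind is sketched below: reading the browser's own bandwidth estimate from getStats() and then capping the video sender with setParameters() and applyConstraints(). The thresholds and constraint values are arbitrary placeholders, and we don't know yet whether this is the right approach:

```javascript
// Hedged sketch: periodically read the browser's bandwidth estimate and cap the
// video sender accordingly. The 80 kbps floor, 150 kbps threshold and degraded
// constraints are illustrative values only, not recommendations from any spec.
async function adaptToBandwidth(pc, videoTrack) {
  const sender = pc.getSenders().find(s => s.track && s.track.kind === 'video');
  if (!sender) return;

  setInterval(async () => {
    const stats = await pc.getStats();
    let estimate = null;
    stats.forEach(report => {
      if (report.type === 'candidate-pair' && report.availableOutgoingBitrate) {
        estimate = report.availableOutgoingBitrate; // bits per second
      }
    });
    if (estimate === null) return;

    // Cap the encoder somewhat below the estimate so audio keeps some headroom.
    const params = sender.getParameters();
    if (!params.encodings || params.encodings.length === 0) return;
    params.encodings[0].maxBitrate = Math.max(80000, Math.floor(estimate * 0.8));
    await sender.setParameters(params);

    // On a very poor link, also reduce resolution and frame rate at the source.
    if (estimate < 150000) {
      await videoTrack.applyConstraints({ width: 320, height: 180, frameRate: 10 });
    }
  }, 3000);
}
```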

luzaranza

1 Answer


If you're wondering about Zoom's web client specifically – yes, it uses WebRTC, but only as a transport option, not for the media-related parts. Their encoding/decoding is based on WebAssembly and they draw the video frames on a canvas. See this article for more details.
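Very roughly, the pattern is something like the sketch below. This is purely illustrative (Zoom's internals are not public): decodeToRGBA() stands in for a hypothetical WebAssembly decoder, and dataChannel for a data channel used only as transport:

```javascript
// Illustration of the "wasm decode + canvas render" pattern, not Zoom's actual code.
// dataChannel is assumed to be an RTCDataChannel carrying encoded frames;
// decodeToRGBA() is a hypothetical WebAssembly decoder returning raw pixels.
const canvas = document.querySelector('#remoteCanvas');
const ctx = canvas.getContext('2d');

dataChannel.onmessage = ({ data }) => {
  // data: ArrayBuffer containing one encoded video frame.
  const { pixels, width, height } = decodeToRGBA(new Uint8Array(data));
  canvas.width = width;
  canvas.height = height;
  ctx.putImageData(new ImageData(new Uint8ClampedArray(pixels), width, height), 0, 0);
};
```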

But I guess your question is really more like "can I build a production-ready conferencing app using only the WebRTC APIs?"

I believe you can, but there are some limitations.

  1. WebRTC is a bit unpredictable on mobile devices, especially on iOS. Take a look at this article for details. I personally had a very bad time last year with WebRTC calls on iOS and ended up writing a native client. I hope it will get better over time.
  2. In some restricted networks WebRTC traffic might be blocked; that's probably the reason Zoom has a fallback to WebSockets. (See the sketch after this list for one common workaround.)
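For the restricted-network case, the usual workaround is to also offer a TURN server reachable over TCP/TLS on port 443, so media can still be relayed when UDP is blocked. A minimal sketch, with placeholder hostname and credentials:

```javascript
// Hedged sketch: TURN over TCP and TLS as a relay fallback when UDP is blocked.
// turn.example.com, the username and the credential are placeholders.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.com:3478' },
    {
      urls: [
        'turn:turn.example.com:3478?transport=tcp',
        'turns:turn.example.com:443?transport=tcp' // TLS on 443, passes most firewalls
      ],
      username: 'user',
      credential: 'secret'
    }
  ]
});
```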

If you intend to record the conferences or to have a lot of participants, a media server might be needed. I'd recommend taking a look at some open-source WebRTC-based conferencing apps to find out whether one of them suits your needs:

Ivan