
I want to ask an experimental kind of question. I have a WebRTC stream that needs to be played by XBMC/Kodi. While planning this (note that this was without any prototyping code), I concluded that the biggest problem is converting and sending the stream. This is the plan:

  1. Get stream (let's ignore this)
  2. Send the stream to Node.JS through a WebSocket (this shouldn't be too hard, as long as it is possible at all, which I am not sure of)
  3. Receive stream through WebSockets in Node.JS
  4. Convert the stream
  5. Send it as something acceptable by XBMC/Kodi (say RTP)

The last two steps are the hardest, and I have no idea how to do them. Could someone help me?

JSFTW
  • Numbers 2 and 3 make no sense; WebRTC connects browser directly to browser, you can't send it through websockets? – adeneo Dec 28 '14 at 16:35
  • @adeneo I did write I don't know for sure if I can send it through websockets. But the answer [here](http://stackoverflow.com/questions/16655544/webrtc-videochat-through-websockets) says that that project with node.js uses websockets. Perhaps there are more ways – JSFTW Dec 28 '14 at 16:42
  • websockets for **signaling**, you need some way to authenticate users and send a signal to the client so the WebRTC stream can start, the stream itself would be sent directly. – adeneo Dec 28 '14 at 17:28
  • @adeneo I don't understand. The computer (1st client) sends the stream to the server and then the server opens another websocket to XBMC/Kodi with the stream url. At least that's how I imagine it. How would the stream be sent directly? – JSFTW Dec 28 '14 at 17:30
  • WebRTC doesn't go through the server, or at least it shouldn't; that's the point. It's p2p and connects one user's browser directly to another user's browser, so the traffic doesn't have to go through your server at all. You just have to send some signals to get things started, and that's where sockets come in. – adeneo Dec 28 '14 at 17:40
  • @adeneo But the problem with that is that XBMC/Kodi needs a URL with a stream format it supports, and WebRTC provides neither of those things – JSFTW Dec 28 '14 at 17:55
  • I don't know how Kodi works, or how you'd convert the stream etc. It could probably be routed through the server somehow, but that sorta defeats the purpose of the p2p thingy ? – adeneo Dec 28 '14 at 18:02
  • @adeneo So what do I do then? Maybe it's still somehow possible to send to the server? – JSFTW Dec 28 '14 at 18:03

1 Answer


For all that is said about WebRTC being peer-to-peer, there are some lesser-known facts. Peer-to-peer is not always possible due to inconsistencies in internet architecture, most prominently symmetric NATs (network address translators), which are common on mobile networks and some 'badly behaved' networks.

On most other networks, with WebRTC, you do not need to send data through your server: it connects peer to peer, using the STUN protocol to discover public socket details and hole punching to actually transmit data. You do need a server for the signalling part, as signalling is not part of WebRTC itself. For signalling, you may use protocols like SIP, websockets, etc.
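To illustrate the signalling role described above, here is a minimal, library-agnostic sketch of a relay: the server does nothing more than forward SDP offers/answers and ICE candidates between peers, and the media itself never touches it. The peer ids and message shape here are assumptions for illustration, not part of any WebRTC API.

```javascript
// Minimal signalling relay sketch. In a real deployment each `send`
// callback would be a websocket send bound to that peer's connection
// (e.g. using the `ws` npm package); here it is just a function.
class SignallingRelay {
  constructor() {
    this.peers = new Map(); // peerId -> send callback
  }
  register(peerId, send) {
    this.peers.set(peerId, send);
  }
  // msg: { from, to, type: 'offer' | 'answer' | 'candidate', payload }
  relay(msg) {
    const send = this.peers.get(msg.to);
    if (!send) return false;          // target peer not connected
    send(JSON.stringify(msg));        // forward the message untouched
    return true;
  }
}
```

Once both peers have exchanged an offer, an answer, and their ICE candidates through such a relay, the server's job is done and the media flows directly between them.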

Having said that, as a failsafe mechanism when p2p is not possible, you may route the traffic through your server. The good thing is that WebRTC supports this approach via TURN servers. ICE is used to identify the best available path (p2p via STUN, or relaying data through a TURN server). Note that the latter is no longer peer to peer, and relaying data through a TURN server requires high bandwidth, incurring significant costs.
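As a sketch of how a browser peer is told about both fallback paths, the `iceServers` list names the STUN and TURN servers; the hostnames and credentials below are placeholders, not real servers:

```javascript
// ICE server configuration sketch. The browser tries STUN-discovered
// p2p candidates first and falls back to the TURN relay only when
// direct connectivity fails. All addresses here are hypothetical.
const rtcConfig = {
  iceServers: [
    { urls: 'stun:stun.example.org:3478' },   // public-address discovery
    {
      urls: 'turn:turn.example.org:3478',     // relay fallback (costly)
      username: 'demo',
      credential: 'secret'
    }
  ]
};
// In the browser: const pc = new RTCPeerConnection(rtcConfig);
```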

Now, let me address some incorrect assumptions in your points:

1. You can handle it as you have stated.

2. This step will be accomplished by the TURN server. Internally, WebRTC does not use websockets. It uses SRTP (Secure Real-time Transport Protocol) at the application layer, over UDP or TCP (depending on firewall traversal and reliability requirements). So sending a WebRTC stream through websockets is not possible; that is a completely different approach.

3. Same as point 2.

4. No on-the-fly conversion is normally done or recommended (the lag introduced by transcoding would throw the real-time property out the window). Any such conversion should be done in step 1.

Before the session is initiated, SDP (Session Description Protocol) messages relay the supported audio and video codecs to both clients during the signalling phase.

5. Once again, the thing to note is that after the session is initialized, whether p2p or through a TURN server, the data should flow uninterrupted between the two clients. That is the essence of WebRTC.
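The codec negotiation mentioned in point 4 can be sketched with a small helper that extracts codec names from an SDP blob; the SDP fragment below is hand-written for illustration, not captured from a real session:

```javascript
// Pull the negotiated codec names out of an SDP blob.
// rtpmap lines look like: a=rtpmap:<payload> <codec>/<clock rate>[/channels]
function codecsFromSdp(sdp) {
  return [...sdp.matchAll(/^a=rtpmap:\d+ ([^/]+)\//gm)].map(m => m[1]);
}

const sampleSdp = [
  'v=0',
  'm=audio 9 UDP/TLS/RTP/SAVPF 111',
  'a=rtpmap:111 opus/48000/2',
  'm=video 9 UDP/TLS/RTP/SAVPF 96',
  'a=rtpmap:96 VP8/90000'
].join('\n');

console.log(codecsFromSdp(sampleSdp)); // [ 'opus', 'VP8' ]
```

This is also why no conversion happens mid-stream: both sides commit to codecs like Opus and VP8 during this exchange, before any media flows.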

If you want something else, try plain websockets. They don't need anything more than websocket support on the client and server side. Websockets reuse the whole TCP/IP/HTTP protocol stack, except that the HTTP connection is switched to the websocket protocol at the application layer via an Upgrade request to the server. This allows bidirectional data flow between server and clients, and you are freer to do computations on the data.


A likely design is to use websockets for signalling before initiating the WebRTC session between the clients.

P.S. Due to low reputation, I can't post more than 2 links. Please refer to Wikipedia for any terms that are unclear to you.

GulshanZealous
  • Please read [How do I format my posts using Markdown or HTML?](http://stackoverflow.com/help/formatting). – buhtz Dec 18 '16 at 13:00
  • As the answer to the question doesn't require code, it is unclear to me how I might format it better. However, I have tried my best, and it seems more readable to me now. – GulshanZealous Dec 18 '16 at 13:25