I've been trying to set up live RTSP streaming of video and audio over a single stream.
What I did is very similar to their example (creating a ServerMediaSession and adding two SubSessions to it, one for video and one for audio). The only change I've made is a new byte-stream source called TcpSource, which is very similar to their ByteStreamFileSource, except that instead of fread() I call recv() in the doReadFromFile method (video and audio use different sockets, of course).
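To illustrate, here is a stripped-down sketch of what TcpSource does (not my exact code; the class and method names below are just for illustration, and I've kept ByteStreamFileSource's structure of reading via the task scheduler's background read handling, with recv() in place of fread()):

```cpp
// TcpSource: a FramedSource that reads from an already-connected TCP socket.
// Simplified sketch, modeled on live555's ByteStreamFileSource.
#include "FramedSource.hh"
#include <sys/socket.h>
#include <sys/time.h>

class TcpSource : public FramedSource {
public:
  static TcpSource* createNew(UsageEnvironment& env, int socketNum) {
    return new TcpSource(env, socketNum);
  }

protected:
  TcpSource(UsageEnvironment& env, int socketNum)
    : FramedSource(env), fSocketNum(socketNum) {}

  virtual ~TcpSource() {
    envir().taskScheduler().turnOffBackgroundReadHandling(fSocketNum);
  }

private:
  // Called by the downstream object when it wants the next chunk of data:
  virtual void doGetNextFrame() {
    // Ask the event loop to call us back when the socket is readable,
    // rather than blocking here:
    envir().taskScheduler().turnOnBackgroundReadHandling(
        fSocketNum, incomingDataHandler, this);
  }

  static void incomingDataHandler(void* clientData, int /*mask*/) {
    ((TcpSource*)clientData)->doReadFromSocket();
  }

  void doReadFromSocket() {
    envir().taskScheduler().turnOffBackgroundReadHandling(fSocketNum);

    // The equivalent of ByteStreamFileSource's fread(), but on a socket:
    int n = recv(fSocketNum, (char*)fTo, fMaxSize, 0);
    if (n <= 0) {
      handleClosure(this);  // connection closed, or a read error
      return;
    }

    fFrameSize = (unsigned)n;
    fNumTruncatedBytes = 0;
    gettimeofday(&fPresentationTime, NULL);
    fDurationInMicroseconds = 0;

    // Hand the data to the downstream object:
    FramedSource::afterGetting(this);
  }

  int fSocketNum;
};
```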
The result is that each subsession works fine on its own. However, when I try to stream both video and audio via TCP, I get either major packet loss, or only one stream working properly while the other gets stuck partway through (e.g. the video freezes while the audio keeps playing fine).
Can you please advise? Does it have something to do with sending/receiving timeouts? Thanks in advance.