I have a voip application I am working on using the OPAL voip SIP stack.

I am overriding a class called OpalLocalEndpoint and reading/writing encoded data to and from my GStreamer pipelines. For reading, I grab the RTP-payloaded data from an appsink; for writing, I push the payloaded data into an appsrc.

I captured the SDP from the exchange with Wireshark.

Here is the client's offer to the app:

v=0
o=- 1319058426 1 IN IP4 192.168.0.71
s=Opal SIP Session
c=IN IP4 192.168.0.71
t=0 0
m=audio 5086 RTP/AVP 125 0 8 124 101
a=sendrecv
a=rtpmap:125 Speex/16000/1
a=fmtp:125 sr=16000,mode=any
a=rtpmap:0 PCMU/8000/1
a=rtpmap:8 PCMA/8000/1
a=rtpmap:124 Speex/8000/1
a=fmtp:124 sr=8000,mode=any
a=rtpmap:101 telephone-event/8000
a=fmtp:101 0-16,32,36
m=video 5088 RTP/AVP 109 108 34 114
b=AS:4096
b=TIAS:4096000
a=sendrecv
a=rtpmap:109 h264/90000
a=fmtp:109 packetization-mode=1;profile-level-id=42C01E
a=rtpmap:108 h263-1998/90000
a=fmtp:108 D=1;F=1;I=1;J=1;CIF=1;CIF4=1;QCIF=1;CUSTOM=320,240,1;CUSTOM=640,480,1
a=rtpmap:34 h263/90000
a=fmtp:34 F=1;CIF=1;CIF4=1;QCIF=1
a=rtpmap:114 MP4V-ES/90000
a=fmtp:114 profile-level-id=5

Here is the server's answer back to the client:

v=0
o=- 1319058099 1 IN IP4 192.168.0.215
s=HHP Video Codec/1.0
c=IN IP4 192.168.0.215
t=0 0
m=audio 5006 RTP/AVP 125 0 8 124
a=inactive
a=rtpmap:125 Speex/16000/1
a=rtpmap:0 PCMU/8000/1
a=rtpmap:8 PCMA/8000/1
a=rtpmap:124 Speex/8000/1
a=maxptime:20
m=video 5004 RTP/AVP 109
b=AS:2048
b=TIAS:2048000
a=sendrecv
a=rtpmap:109 h264/90000
a=fmtp:109 packetization-mode=1;profile-level-id=42c01e

I encode the data with:

 v4l2src name=videoSrc ! video/x-raw-yuv, format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1 ! videobalance name=VideoBalance ! textoverlay name=chanNameFilter ! textoverlay name=osdMessageFilter ! textoverlay name=sessionTimerOverlay ! x264enc byte-stream=true bframes=0 b-adapt=0 tune=0x4 speed-preset=3 bitrate=256 sliced-threads=false profile=0 ! rtph264pay mtu=1412 ! appsink name=videoAppSink sync=false
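As a sanity check, the encode and decode halves can be joined in one local pipeline, taking OPAL and the network out of the picture entirely. This is a sketch assuming GStreamer 0.10's gst-launch-0.10, a working v4l2 camera, and an X display; if this renders correctly, the elements and caps are compatible and the fault lies in transport:

```shell
# Local loopback: encoder -> RTP payload -> RTP depayload -> decoder.
# -v prints the caps negotiated at every link, which can be compared
# against the caps set explicitly on the appsrc.
gst-launch-0.10 -v \
  v4l2src ! "video/x-raw-yuv, format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1" \
  ! x264enc byte-stream=true bframes=0 b-adapt=0 bitrate=256 \
  ! rtph264pay mtu=1412 \
  ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
```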

And I attempt to decode the incoming data with:

appsrc is-live=true do-timestamp=false typefind=true name=videoAppSrc ! application/x-rtp, media=video, payload=109, clock-rate=90000, encoding-type=H264, byte-stream=true, access-unit=true ! rtph264depay ! ffdec_h264 !  xvimagesink name=videoOutputSink

However, while the encoded data now shows up on the client (it didn't at first; I had to keep adding those caps properties until it finally displayed correctly), I have not been able to get the decoding end to work.

It shows a mostly gray screen with blips of pink, yellow, and green. Sometimes I get a little more of the right colors and most of the time just gray.

If I use this same exact pipeline to interact with VLC it works fine. My guess is that I am goofing up the caps somewhere. Can anyone offer any ideas on what I should be looking for?
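One thing worth looking for beyond the caps: gray video with occasional flashes of color is the classic signature of an H.264 decoder that never received SPS/PPS, or that joined the stream mid-GOP. A hedged tweak to the encoder side (assumptions: rtph264pay exposes config-interval, which requires gst-plugins-good >= 0.10.15, and x264enc exposes key-int-max):

```shell
# config-interval=1 re-sends SPS/PPS in-band once a second; key-int-max=30
# forces a keyframe roughly every second at 30 fps, so the decoder can
# recover even if it misses the start of the stream or drops packets.
gst-launch-0.10 v4l2src \
  ! "video/x-raw-yuv, format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1" \
  ! x264enc byte-stream=true bframes=0 b-adapt=0 bitrate=256 key-int-max=30 \
  ! rtph264pay mtu=1412 config-interval=1 \
  ! fakesink
```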

I am having the same trouble with each of my other encoders as well (Theora, H.263, etc.), though each fails in a different way.

blahdiblah
Jonathan Henson
  • Sounds exciting, I'm not even sure I understand all the details :-) I have two ideas though you could try as experiments: In your sip application, try to send through some test pattern, like all 10101010 bytes, and observe on the other end. – Szocske Oct 20 '11 at 21:01
  • Second, try your media generator and sink with some simple network transporter, like netcat, and see if they work. – Szocske Oct 20 '11 at 21:02
  • @Szocske, I can verify that the data makes it to the other end fine via wireshark. I think maybe the problem is in the rtp packet fragmentation? Also, I am curious if the problem lies in some incompatibility between libavcodec and ffmpeg's ffdec_h264. Also, I don't really have access to the network transport layer of the sip app as that is managed by opal. If I had access, I would turn it off and use gstrtpbin with udpsink /udpsrc and be done with it. – Jonathan Henson Oct 20 '11 at 22:00
  • wireshark only shows you the data gets half way, in experiment 1 you want to prove the data is intact after extracting from RTP, and before passing to the media app. – Szocske Oct 21 '11 at 11:54
  • @Szocske The transport layer of OPAL handles grabbing the rtp packet over the socket, it then passes to me what I assume is an untouched packet which I use Gstreamer to depayload. I don't know how I can perform this test because in either scenario, I only have access to one end. I can send a test pattern. I will try that in the morning by using gstreamer's videotestsrc. H.264 is a little tricky, I can't just send an arbitrary pattern as the payloader needs certain data from the encoder. – Jonathan Henson Oct 24 '11 at 03:45
  • @Szocske I confirmed the problem is indeed the transport layer of OPAL. I bypassed the part of OPAL that sends the RTP packets and sent them myself via udpsink and received them on udpsrc and it worked. So, the problem is not related to the actual streaming. – Jonathan Henson Oct 24 '11 at 18:39

1 Answer

It turns out the VOIP stack, which is otherwise an excellent stack, either has a bug or I do not understand the way it packages and transmits the RTP packets. I bypassed it and sent the data via gstreamer udpsink and udpsrc and it works fine. Now my only remaining questions will be directed to the dev team of the stack. Thanks for your help.
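For anyone hitting the same wall, the bypass described above looks roughly like this. It is a sketch only; the host, port, and payload type are assumptions chosen to match the SDP captures in the question:

```shell
# Sender: replace the appsink with a udpsink pointed at the peer,
# keeping the same encoder and payloader settings.
gst-launch-0.10 v4l2src \
  ! "video/x-raw-yuv, format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1" \
  ! x264enc byte-stream=true bframes=0 b-adapt=0 bitrate=256 \
  ! rtph264pay mtu=1412 pt=109 \
  ! udpsink host=192.168.0.215 port=5004

# Receiver: replace the appsrc with a udpsrc carrying the same RTP caps.
# Note the caps field is encoding-name, not encoding-type.
gst-launch-0.10 udpsrc port=5004 \
  caps="application/x-rtp, media=(string)video, payload=(int)109, clock-rate=(int)90000, encoding-name=(string)H264" \
  ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
```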

Jonathan Henson
  • Can you help me out? I have two GStreamer pipelines that transmit voice over IP on a UDP port, and I want to automate the transmission in a telephonic manner, i.e. one Linux box calls the other and, after acknowledgement, the pipelines on both sides start the voice exchange. P.S. I asked this here because I thought it was relevant to this thread. – fer y Nov 29 '13 at 12:21
  • I have created a new question for this matter. Please check it out: http://stackoverflow.com/questions/20329685/signaling-a-wi-fi-head-set – fer y Dec 02 '13 at 15:41