
I am new to both SIP and RTP. I have successfully created a SIP call, but I still don't have voice on the session.

I understand that I have to create an RTP stream and send packets, but I am unable to decide where to start. I found the JMF libraries (jar), but I don't understand how to use them. I also want to play audio to the person I call during the transaction.

Do I have to start the RTP session inside the SIP INVITE, or do I have to create the RTP session separately after the call is answered? I am not able to find an answer to this question.

I would also like to know how to create an RTP session. I am doing plain Java programming; the JMF tutorial I found requires an installation. I want to know if it is possible with plain Java. I have the jmf-2.1.1e.jar file and would like to know how to use it.

public SoundSenderDemo(boolean isLocal, int RTPsocket)  {
        DatagramSocket rtpSocket = null;
        DatagramSocket rtcpSocket = null;
        int socket = RTPsocket;


        try {
            rtpSocket = new DatagramSocket(socket);
            rtcpSocket = new DatagramSocket(socket+1);
        } catch (Exception e) {
            System.out.println("RTPSession failed to obtain port");
        }

        rtpSession = new RTPSession(rtpSocket, rtcpSocket);
        rtpSession.RTPSessionRegister(this,null, null);
        Participant p = new Participant("sip:username@password",socket,(socket + 1));
//      rtpSession.addParticipant(p);
        System.out.println("CNAME: " + rtpSession.CNAME());
        System.out.println("RTPSession: " + rtpSession.toString());
        System.out.println("Participant: " + rtpSession.getParticipants());
        System.out.println("unicast Receivers: " + rtpSession.getUnicastReceivers());

        this.local = isLocal;
    }

public void run() {
        if(RTPSession.rtpDebugLevel > 1) {
            System.out.println("-> Run()");
        } 
        File soundFile = new File(filename);
        if (!soundFile.exists()) {
            System.err.println("Wave file not found: " + filename);
            return;
        }

        AudioInputStream audioInputStream = null;
        try {
            audioInputStream = AudioSystem.getAudioInputStream(soundFile);
        } catch (UnsupportedAudioFileException e1) {
            e1.printStackTrace();
            return;
        } catch (IOException e1) {
            e1.printStackTrace();
            return;
        }

        //AudioFormat format = audioInputStream.getFormat();
        AudioFormat.Encoding encoding =  new AudioFormat.Encoding("PCM_SIGNED");
        AudioFormat format = new AudioFormat(encoding,((float) 8000.0), 16, 1, 2, ((float) 8000.0) ,false);
        System.out.println(format.toString());


        if(! this.local) {
            // To time the output correctly, we also play at the input:
            auline = null;
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);

            try {
                auline = (SourceDataLine) AudioSystem.getLine(info);
                auline.open(format);
            } catch (LineUnavailableException e) {
                e.printStackTrace();
                return;
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }

            if (auline.isControlSupported(FloatControl.Type.PAN)) {
                FloatControl pan = (FloatControl) auline
                .getControl(FloatControl.Type.PAN);
                if (this.curPosition == Position.RIGHT)
                    pan.setValue(1.0f);
                else if (this.curPosition == Position.LEFT)
                    pan.setValue(-1.0f);
            }

            auline.start();
        }

        int nBytesRead = 0;
        byte[] abData = new byte[EXTERNAL_BUFFER_SIZE];
        long start = System.currentTimeMillis();
        try {
            while (nBytesRead != -1 && pktCount < 200) {
                nBytesRead = audioInputStream.read(abData, 0, abData.length);

                if (nBytesRead >= 0) {
                    rtpSession.sendData(abData);
                    auline.write(abData, 0, abData.length);
                    pktCount++;
                }
                if(pktCount == 100) {
                    Enumeration<Participant> iter = this.rtpSession.getParticipants();
                    Participant p = null;

                    while(iter.hasMoreElements()) {
                        p = iter.nextElement();

                        String name = "name";
                        byte[] nameBytes = name.getBytes();
                        String data = "abcd";
                        byte[] dataBytes = data.getBytes();

                        int ret = rtpSession.sendRTCPAppPacket(p.getSSRC(), 0, nameBytes, dataBytes);
                        System.out.println("!!!!!!!!!!!! ADDED APPLICATION SPECIFIC " + ret);
                    }
                    if(p == null)
                        System.out.println("No participant with SSRC available :(");
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
            return;
        }
        System.out.println("Time: " + (System.currentTimeMillis() - start)/1000 + " s");

        try { Thread.sleep(200);} catch(Exception e) {}

        this.rtpSession.endSession();

        try { Thread.sleep(2000);} catch(Exception e) {}
        if(RTPSession.rtpDebugLevel > 1) {
            System.out.println("<- Run()");
        } 
    }

While sending the ACK:

dialog.sendAck(ackRequest);
logger.debug(ackRequest.toString());
aDemo = new SoundSenderDemo(false, RTPsocket);
RTPstart();


public void RTPstart(){
        // Start RTP Session
        String file = "C:/universAAL/workspaces/SIPfinaltest withRTP/SIPfinaltest/JSIP/garfield_converted.wav";

        aDemo.filename = args[0];
        aDemo.run();
        System.out.println("pktCount: " + aDemo.pktCount);
    }

Also, in the INVITE I have set:

String sdpData = "v=0\n" + 
                    "o=user1 795808818 480847547 IN IP4 "+localIP+"\n" + 
                    "s=-\n" + 
                    "c=IN IP4 "+localIP+"\n" + 
                    "t=0 0\n" + 
                    "m=audio 8000 RTP/AVP 0 8 101\n" + 
                    "a=rtpmap:0 PCMU/8000\n" + 
                    "a=rtpmap:8 PCMA/8000\n" + 
                    "a=rtpmap:101 telephone-event/8000\n" + 
                    "a=sendrecv";
             byte[] contents = sdpData.getBytes();

This is the response:

SIP/2.0 200 OK
Via: SIP/2.0/UDP 10.99.134.149:5060;branch=z9hG4bK-333831-44ef6fc075d847c6420a0f95b2022345;received=10.99.134.149;rport=5060
From: <sip:top160_167@10.99.64.2>;tag=-1209613008
To: <sip:86940140@10.99.64.2>;tag=as12f64e9a
Call-ID: 5ac297147c47e8e20cc148dda4f350cf@10.99.134.149
CSeq: 5 INVITE
Server: Asterisk PBX 10.5.1
Allow: INVITE,ACK,CANCEL,OPTIONS,BYE,REFER,SUBSCRIBE,NOTIFY,INFO,PUBLISH
Supported: replaces,timer
Contact: <sip:86940140@10.99.64.2:5060>
Content-Type: application/sdp
Content-Length: 255

v=0
o=root 532626251 532626252 IN IP4 10.99.64.2
s=Asterisk PBX 10.5.1
c=IN IP4 10.99.64.2
t=0 0
m=audio 7758 RTP/AVP 8 101
a=rtpmap:8 PCMA/8000
a=rtpmap:101 telephone-event/8000
a=fmtp:101 0-16
a=silenceSupp:off - - - -
a=ptime:20
a=sendrecv
gourig

1 Answer

If you're interested in SIP/RTP session establishment using plain Java programming, check http://peers.sourceforge.net/. Take a look at the documentation page.

JMF is outdated. There are many Java RTP stacks on the web; just google "java rtp stack".
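If you want to avoid a full stack, sending RTP yourself over a `DatagramSocket` is not much code either. Here is a minimal sketch of an RFC 3550 packetizer (fixed 12-byte header, no CSRC or extensions); the class and method names, and the example SSRC, are mine, not from any library:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class RtpSender {
    // Fixed 12-byte RTP header (RFC 3550): V=2, P=0, X=0, CC=0, M=0.
    static byte[] buildPacket(int payloadType, int seq, long timestamp,
                              long ssrc, byte[] payload) {
        byte[] pkt = new byte[12 + payload.length];
        pkt[0] = (byte) 0x80;                  // version 2, no padding/extension/CSRC
        pkt[1] = (byte) (payloadType & 0x7F);  // e.g. 8 = PCMA
        pkt[2] = (byte) (seq >> 8);            // 16-bit sequence number, +1 per packet
        pkt[3] = (byte) seq;
        pkt[4] = (byte) (timestamp >> 24);     // 32-bit timestamp in sample units
        pkt[5] = (byte) (timestamp >> 16);
        pkt[6] = (byte) (timestamp >> 8);
        pkt[7] = (byte) timestamp;
        pkt[8] = (byte) (ssrc >> 24);          // 32-bit SSRC, constant for the stream
        pkt[9] = (byte) (ssrc >> 16);
        pkt[10] = (byte) (ssrc >> 8);
        pkt[11] = (byte) ssrc;
        System.arraycopy(payload, 0, pkt, 12, payload.length);
        return pkt;
    }

    // Send one 20 ms A-law frame (160 bytes at 8000 Hz) to the remote peer.
    static void sendFrame(DatagramSocket sock, InetAddress remote, int remotePort,
                          int seq, long timestamp, byte[] frame) throws Exception {
        byte[] pkt = buildPacket(8, seq, timestamp, 0x12345678L, frame);
        sock.send(new DatagramPacket(pkt, pkt.length, remote, remotePort));
    }
}
```

For 8 kHz audio in 20 ms frames, the timestamp advances by 160 per packet and the sequence number by 1.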

When is the right time to start the RTP session? It depends: sometimes you can start it only when you receive the 200 OK response; sometimes you have to create it earlier, when a 183 response is received.

You can also create the RTP session when the INVITE is sent, and then update it with the right information (remote IP address, port, codec) when the response is received.
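The "right information" lives in the answer SDP: the connection address on the c= line and the audio port on the m=audio line. A rough parser, assuming a single audio stream and an IPv4 session-level c= line (the class name is mine):

```java
// Sketch: pull the remote RTP destination out of an SDP answer body.
public class SdpInfo {
    public final String address;
    public final int audioPort;

    public SdpInfo(String address, int audioPort) {
        this.address = address;
        this.audioPort = audioPort;
    }

    public static SdpInfo parse(String sdp) {
        String address = null;
        int port = -1;
        for (String line : sdp.split("\r?\n")) {
            if (line.startsWith("c=IN IP4 ")) {
                // connection address, e.g. "c=IN IP4 10.99.64.2"
                address = line.substring("c=IN IP4 ".length()).trim();
            } else if (line.startsWith("m=audio ")) {
                // media line, e.g. "m=audio 7758 RTP/AVP 8 101"
                port = Integer.parseInt(line.split(" ")[1]);
            }
        }
        return new SdpInfo(address, port);
    }
}
```

Applied to the 200 OK shown in the question, this yields 10.99.64.2 and 7758; that address/port pair is where the RTP packets have to be sent.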

yohann.martineau
  • Hi, thank you for your reply. I have managed to create an RTP session once the ACK is sent. But the problem is that the RTP session is started locally and the audio is played on the local PC. I cannot figure out where to set the remote address so that it plays on the phone. I am pasting my code in the question. – gourig May 21 '15 at 13:43
  • The remote IP address and port number are in the SDP body of the 200 OK response. The c= line carries the remote IP address (here 10.99.64.2), and the m= line carries the port (here 7758). Also make sure the right codec is used, here PCMA. – yohann.martineau May 21 '15 at 18:06
  • I have used the IP and port from the response, and the audio is a .wav file in PCM format with 8-bit samples at 8000 Hz, but it still does not create the RTP session; it plays the audio on my laptop. – gourig May 21 '15 at 19:05
  • You cannot stream a wav file as-is over the network. A wav file contains headers; you must strip them, and maybe remove other things too. You also need headers on the RTP packets, etc. You can extract the raw samples from a wav file using Audacity. – yohann.martineau May 25 '15 at 15:17
  • Also, in the received SDP the codec chosen by the remote side is PCMA, so you need to make sure the audio is encoded with A-law, not plain PCM. If you want to encode the audio ahead of time, you may not want to offer both µ-law (payload type 0) and A-law (payload type 8). Also, you have hard-coded your media port to 8000; is that intended, or did you accidentally put the sampling rate there? – korvus May 25 '15 at 18:04
  • I am actually creating the audio using FreeTTS, and when I tracked my packets with Wireshark I realized that my RTP connection is never created at all. I think I need to fix that first. Is my RTP initialization correct? – gourig May 26 '15 at 14:44
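Since the answer SDP selects PCMA, the 16-bit linear PCM samples from the wav file (after its header is stripped) have to be compressed to 8-bit A-law before going into the RTP payload. A sketch of the standard G.711 A-law encoder; the class and method names are my own helpers, not part of jlibrtp:

```java
public class G711ALaw {
    // Encode one 16-bit signed linear PCM sample to 8-bit A-law (G.711).
    static byte encodeSample(short pcm) {
        int sample = pcm;
        int sign = 0;
        if (sample < 0) {
            sign = 0x80;
            sample = -sample;                 // int arithmetic, so -32768 cannot overflow
        }
        if (sample > 32635) sample = 32635;   // clip to the codec's input range
        int exponent = 7;
        for (int mask = 0x4000; (sample & mask) == 0 && exponent > 0; exponent--, mask >>= 1) {
            // walk down to the segment of the highest set bit
        }
        int mantissa = (sample >> ((exponent == 0) ? 4 : (exponent + 3))) & 0x0F;
        return (byte) ((sign | (exponent << 4) | mantissa) ^ 0xD5); // A-law even-bit inversion
    }

    // Encode a buffer of linear samples; output is one byte per sample.
    public static byte[] encode(short[] pcm) {
        byte[] out = new byte[pcm.length];
        for (int i = 0; i < pcm.length; i++) {
            out[i] = encodeSample(pcm[i]);
        }
        return out;
    }
}
```

Note the halving of the data rate: 160 samples (320 bytes of 16-bit PCM) become the 160-byte payload of one 20 ms packet. A-law silence encodes to 0xD5, which is why captured G.711 silence is not a run of zeros.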