I have a job where I have to take continuous screenshots and capture sound from a desktop, then publish them as a live video stream. I use Wowza Media Server 3.0.3 for stream publishing, and Xuggler to generate the image frames and to put them, together with the sound buffers, into packets. I have the following problem:
I start my program, and the publishing of image frames and sound packets begins. The Wowza console informs me that the packets are being published. When I open a media player (in this case VLC), the video part of the stream works like a charm (I continuously see the image frames captured from my desktop), but the audio part is very poor. When I start to play the live stream, VLC buffers roughly 3 seconds of sound recorded from my desktop and plays it back at a higher speed. After a longer pause it buffers again and plays the next part. In my code I continuously send sound IBuffers, encode them to MP3 and publish them as packets, so I cannot understand why the sound does not play back as continuously as the image frames do.
Does anyone have an answer, or any experience with this problem?
I've made a copy of my code that streams only the desktop sound, without the image frames. This is the snippet where I capture the sound and send it on for encoding and publishing:
while (true)
{
    // Read whatever is currently available on the capture line
    byte buffer[] = new byte[line.available()];
    int count = line.read(buffer, 0, buffer.length);
    IBuffer iBuf = IBuffer.make(null, buffer, 0, count);

    // Write the audio frame into the stream
    _AudioWriter.encodeFrameToStream(iBuf, buffer, firstTimeStamp);

    try {
        Thread.sleep(100);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
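For context, the line object above is a standard Java Sound TargetDataLine. Its exact setup is not shown here; a minimal sketch of how such a capture line is typically opened (the concrete format values are assumptions, chosen to match the 44100 Hz, mono, 16-bit values used in the encoder below) could look like this:

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.TargetDataLine;

// Sketch only: opens a capture line with the assumed format
static TargetDataLine openCaptureLine() throws LineUnavailableException {
    // Assumed capture format: 44100 Hz, 16-bit signed PCM, mono, little-endian
    AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);
    DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
    TargetDataLine line = (TargetDataLine) AudioSystem.getLine(info);
    line.open(format);
    line.start();
    return line;
}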
This is the part where I take the IBuffer and encode it to MP3. Afterwards I publish it as a packet:
public void encodeFrameToStream(IBuffer ibuffer, byte[] buffer, long firstTimeStamp) {
    // Timestamp of this chunk, relative to the start of the capture
    long now = System.currentTimeMillis();
    long timeStamp = now - firstTimeStamp;

    // Wrap the raw PCM bytes as 16-bit signed samples
    IAudioSamples outChunk = IAudioSamples.make(ibuffer, 1, IAudioSamples.Format.FMT_S16);
    if (outChunk == null)
    {
        return;
    }

    long numSample = buffer.length / outChunk.getSampleSize();
    outChunk.setComplete(true, numSample, 44100, 1, IAudioSamples.Format.FMT_S16, timeStamp);

    // Encode the samples into an MP3 packet and write it to the container
    IPacket packet2 = IPacket.make();
    packet2.setStreamIndex(0);
    getCoder2().encodeAudio(packet2, outChunk, 0);
    outChunk.delete();

    if (packet2.isComplete()) {
        getContainer().writePacket(packet2);
    }
}
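The getCoder2() and getContainer() helpers are not shown above. For completeness, a rough sketch of how such an Xuggler output container and MP3 stream coder are typically set up (the RTMP URL and bit rate below are placeholders, not my actual values) might look like this:

import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;

// Sketch only: open a writable container on the Wowza RTMP endpoint
// (placeholder URL) and add a single MP3 audio stream to it.
IContainer container = IContainer.make();
container.open("rtmp://localhost:1935/live/myStream", IContainer.Type.WRITE, null);

IStream stream = container.addNewStream(0);
IStreamCoder coder = stream.getStreamCoder();
coder.setCodec(ICodec.ID.CODEC_ID_MP3);
coder.setSampleRate(44100);   // must match the capture format
coder.setChannels(1);         // mono, as in the capture format
coder.setBitRate(128000);     // assumed bit rate
coder.open();

container.writeHeader();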