I'm building a live video streaming application that uses a number of different libraries. I'm using NAudio to unpack and play the audio stream as it comes in. I found this thread on their discussion boards, which I used like so:
BufferedWaveProvider mybufferwp = null;
WaveOut wo = new WaveOut();
WaveFormat wf = new WaveFormat(16000, 1);   // 16 kHz, mono, 16-bit PCM
AudioClient _audioClient;

void MainWindow()
{
    // Connect to the Kinect Service audio stream
    _audioClient = new AudioClient();
    _audioClient.AudioFrameReady += _audioClient_AudioFrameReady;
    _audioClient.Connect(parent.TempIp, parent.AudioPort);

    // Buffer the incoming audio and start playback
    mybufferwp = new BufferedWaveProvider(wf);
    mybufferwp.BufferDuration = TimeSpan.FromMinutes(5);
    wo.Init(mybufferwp);
    wo.Play();
}

void _audioClient_AudioFrameReady(object sender, AudioFrameReadyEventArgs e)
{
    // Queue each incoming audio frame for playback
    if (mybufferwp != null)
    {
        mybufferwp.AddSamples(e.AudioFrame.AudioData, 0, e.AudioFrame.AudioData.Length);
    }
}
My problem is that the audio is slightly delayed. Not by much, granted, but it's noticeable, and I was hoping there might be something I could do to bring it more in sync with my video feed, which is very nearly live.
Extra info: AudioClient comes from the Kinect Service, which allows me to send and receive Kinect camera data.
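For what it's worth, the kind of change I've been wondering about looks roughly like the untested sketch below: shrink the buffer, lower WaveOut's DesiredLatency, and clear the backlog whenever playback falls too far behind the incoming frames. The StartLowLatencyPlayback name and the 300 ms threshold are just my own placeholders, not anything from the NAudio thread.

void StartLowLatencyPlayback()   // hypothetical replacement for the setup in MainWindow()
{
    mybufferwp = new BufferedWaveProvider(wf);
    mybufferwp.BufferDuration = TimeSpan.FromSeconds(5);   // far smaller than 5 minutes
    mybufferwp.DiscardOnBufferOverflow = true;             // drop samples rather than throw if the buffer ever fills

    wo.DesiredLatency = 100;   // ~100 ms of device buffering (I believe WaveOut defaults to 300 ms); must be set before Init
    wo.Init(mybufferwp);
    wo.Play();
}

void _audioClient_AudioFrameReady(object sender, AudioFrameReadyEventArgs e)
{
    if (mybufferwp == null) return;

    // If more than ~300 ms of audio has piled up (an arbitrary threshold on my part),
    // clear the backlog so the newest samples play almost immediately.
    if (mybufferwp.BufferedDuration > TimeSpan.FromMilliseconds(300))
    {
        mybufferwp.ClearBuffer();
    }

    mybufferwp.AddSamples(e.AudioFrame.AudioData, 0, e.AudioFrame.AudioData.Length);
}

My worry is that clearing the buffer like that could cause audible glitches, so I'm not sure it's the right approach, or whether the delay is coming from somewhere else entirely.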