My issue is that when I'm streaming a continuous stream of data over a local LAN, random bytes are occasionally lost in the process.
As it stands, the code streams about 1027 bytes roughly 40 times a second over the LAN, and sometimes (very rarely) one or more of the bytes are lost.
What baffles me is that the byte isn't actually "lost": it is simply set to 0, regardless of the original data. (I'm using TCP, by the way.)
Here's the sending code:
public void Send(byte[] data)
{
    if (!server)
    {
        if (CheckConnection(serv))
        {
            serv.Send(BitConverter.GetBytes(data.Length));
            serv.Receive(new byte[1]);
            serv.Send(data);
            serv.Receive(new byte[1]);
        }
    }
}
and the receiving code:
public byte[] Receive()
{
    if (!server)
    {
        if (CheckConnection(serv))
        {
            byte[] TMP = new byte[4];
            serv.Receive(TMP);
            TMP = new byte[BitConverter.ToInt32(TMP, 0)];
            serv.Send(new byte[1]);
            serv.Receive(TMP);
            serv.Send(new byte[1]);
            return TMP;
        }
        else return null;
    }
    else return null;
}
The sending and receiving of the single empty bytes is just there to keep the two sides roughly in sync. Personally, I suspect the problem lies on the receiving side of the system, but I haven't been able to prove that yet.
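For context, `Socket.Receive` returns the number of bytes actually read in that call, which can be fewer than the buffer's length; any positions past that count keep the buffer's initial value of 0, which would match the symptom above. A common pattern for reading an exact number of bytes is to loop until the full count has arrived. A minimal sketch, assuming `serv` is a blocking, connected `System.Net.Sockets.Socket` (the helper name `ReceiveExactly` is mine, not from the original code):

```csharp
using System.Net.Sockets;

// Sketch: block until exactly 'count' bytes have been read from the socket.
// Any single Receive call may return fewer bytes than requested, so we
// accumulate reads at increasing offsets until the buffer is full.
private byte[] ReceiveExactly(int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        // Receive(buffer, offset, size, flags) returns the bytes read this call.
        int read = serv.Receive(buffer, offset, count - offset, SocketFlags.None);
        if (read == 0)
        {
            // A return of 0 means the remote side closed the connection.
            throw new SocketException((int)SocketError.ConnectionReset);
        }
        offset += read;
    }
    return buffer;
}
```

With a loop like this, both the 4-byte length prefix and the payload could be read in full before being interpreted, without relying on each `Receive` filling the whole buffer in one call.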