I'm communicating with an old piece of technology that uses binary serial communications. The protocol is a synchronous master/slave exchange: a request message is sent and a response is received.
I've used serial port communications quite a few times before, but always with ASCII data. The SerialPort ReadTimeout is observed when using ReadLine on ASCII data; however, with the Read method it does not seem to be observed, and instead the call returns as soon as any data is received.
See a basic example below
SerialPort comport;

public void SetupSerial()
{
    comport = new SerialPort("COM1");
    comport.BaudRate = 9600;
    comport.Parity = Parity.None;
    comport.DataBits = 8;
    comport.StopBits = StopBits.One;
    comport.DataReceived += Comport_DataReceived;
    comport.ReadTimeout = 5000; // 5 second read timeout
    comport.Open();
}

private void Comport_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    byte[] buffer = new byte[1024];
    int bytesRead = 0;
    try
    {
        // Attempt to read 6 bytes from the port
        bytesRead = comport.Read(buffer, 0, 6);
    }
    catch (TimeoutException)
    {
        Console.WriteLine("Timeout Exception!");
    }
    Console.WriteLine(bytesRead.ToString() + " Bytes Read!");
}
Now, as I understand it, what should happen when I do:
comport.Read(buffer, 0, 6)
is that it should either return 6 bytes or throw a TimeoutException if they don't arrive within the timeout period. But that does not seem to be happening: I only get a TimeoutException if no data at all is received. The call returns as soon as it reads any data, so sometimes I receive 6 bytes, sometimes 1, and so on.
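From what I've read, Read appears to behave this way by design: it returns as soon as any data is available, up to the requested count. The common suggestion seems to be to loop on Read, accumulating bytes until the expected count has arrived, while tracking an overall deadline yourself. A rough sketch of that idea (ReadExact is just a name I've made up for illustration):

// Rough sketch only: loop on Read, accumulating bytes until 'count' have
// arrived or an overall deadline passes. ReadExact is a hypothetical helper.
private static void ReadExact(SerialPort port, byte[] buffer, int offset, int count, int timeoutMs)
{
    DateTime deadline = DateTime.UtcNow.AddMilliseconds(timeoutMs);
    int total = 0;
    while (total < count)
    {
        if (DateTime.UtcNow >= deadline)
            throw new TimeoutException("Timed out waiting for " + count + " bytes");

        // Read returns whatever is currently available (at least 1 byte),
        // so it may return fewer bytes than requested.
        total += port.Read(buffer, offset + total, count - total);
    }
}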
Now if I were always expecting 6 bytes, I could just do
if (comport.BytesToRead < 6) return;
But the messages I receive are variable length, with the length determined by the contents of the first 2 bytes.
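What I'd ideally like is to read the 2-byte header first and then read the remainder based on the decoded length, something along these lines (DecodePayloadLength is a made-up helper standing in for whatever my protocol defines, and ReadExact is the sketch from above):

// Illustrative only: assumes the first 2 bytes form a header from which the
// remaining payload length can be derived (DecodePayloadLength is hypothetical).
byte[] header = new byte[2];
ReadExact(comport, header, 0, 2, 5000);              // read the fixed-size header first
int payloadLength = DecodePayloadLength(header);     // derive remaining length from the header
byte[] payload = new byte[payloadLength];
ReadExact(comport, payload, 0, payloadLength, 5000); // then read exactly that many bytes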