
I'm communicating with an old piece of technology that uses binary serial communications. The protocol works on a synchronous master/slave basis: a request message is sent and a response is received.

I've used serial port communications quite a few times before, but always with ASCII data. The SerialPort ReadTimeout is observed when using ReadLine on ASCII data; however, with the Read method it does not seem to be observed, and the call returns whenever any data is received.

See a basic example below:

    using System;
    using System.IO.Ports;

    SerialPort comport;

    public void SetupSerial()
    {
        comport = new SerialPort("COM1");
        comport.BaudRate = 9600;
        comport.Parity = Parity.None;
        comport.DataBits = 8;
        comport.StopBits = StopBits.One;
        comport.DataReceived += Comport_DataReceived;
        comport.ReadTimeout = 5000; // milliseconds
        comport.Open();
    }

    private void Comport_DataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        byte[] buffer = new byte[1024];

        int bytesRead = 0;
        try
        {
            // I expect this to block until 6 bytes arrive,
            // or to throw once ReadTimeout (5 s) elapses.
            bytesRead = comport.Read(buffer, 0, 6);
        }
        catch (TimeoutException)
        {
            Console.WriteLine("Timeout Exception!");
        }

        Console.WriteLine(bytesRead.ToString() + " Bytes Read!");
    }

Now, as I understand it, what should happen when I call

    comport.Read(buffer, 0, 6)

is that it waits for 6 bytes, or throws a TimeoutException if they don't all arrive within the timeout period. But that is not what happens: I only get a TimeoutException if no data at all is received. The call returns whenever it reads any data, so sometimes I receive 6 bytes, sometimes 1, and so on.
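
One workaround seems to be to loop Read until the requested number of bytes has actually arrived; each individual Read still honours ReadTimeout, so a dead line still surfaces as a TimeoutException. A minimal sketch, assuming the comport field from above (ReadExactly is my own helper name, not a SerialPort method):

    private void ReadExactly(byte[] buffer, int offset, int count)
    {
        int total = 0;
        while (total < count)
        {
            // Read returns as soon as *any* bytes are available,
            // so keep asking for whatever is still missing. If nothing
            // arrives within ReadTimeout, this throws a TimeoutException.
            int n = comport.Read(buffer, offset + total, count - total);
            total += n;
        }
    }

With that, the fixed 6-byte case is just ReadExactly(buffer, 0, 6).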

Now, if I were always expecting 6 bytes, I could just do

    if (comport.BytesToRead < 6) return;

But the messages I receive are variable length, with the length determined by the contents of the first 2 bytes.
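
Presumably the fix is then to frame the messages explicitly: block for the 2-byte header first, derive the remaining length from it, then block for the rest. A rough sketch building on the ReadExactly helper above, where ParseLength is a placeholder for whatever the protocol actually encodes in those two bytes:

    private byte[] ReadMessage()
    {
        // Block for the fixed-size header first.
        byte[] header = new byte[2];
        ReadExactly(header, 0, 2);

        // Derive the payload length from the header (protocol-specific placeholder).
        int payloadLength = ParseLength(header);

        // Then block for exactly the remainder of the message.
        byte[] message = new byte[2 + payloadLength];
        Array.Copy(header, message, 2);
        ReadExactly(message, 2, payloadLength);
        return message;
    }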

Matt Inglis
  • What you experience is normal behaviour. The documentation says of TimeoutException: "No bytes were available to read." The Read method reads the bytes already available in the buffer and then returns; the count parameter only defines the maximum number of bytes to read (the size of your buffer). See [What is the correct way to read a serial port using .NET framework?](https://stackoverflow.com/questions/13754694/what-is-the-correct-way-to-read-a-serial-port-using-net-framework) for some ways to solve this. – H.G. Sandhagen Apr 11 '19 at 18:33
  • Maybe this [answer](https://stackoverflow.com/a/11340682/10927863) can help. – Eliahu Aaron Apr 11 '19 at 18:41
  • How about testing comport.BytesToRead? – Eliahu Aaron Apr 11 '19 at 18:45
  • Thanks for the responses. comport.DiscardInBuffer is evil in this case, as you would lose part of the message should it be stuck in the buffer. All the examples posted are just standard implementations of SerialPort, so I'm not sure how they are relevant. I guess I'm going to have to implement my own buffer to overcome the problem, since the responses I'm expecting are variable length (a sketch of that approach follows below). – Matt Inglis Apr 12 '19 at 09:34
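
For reference, a sketch of that own-buffer approach: accumulate everything that arrives in DataReceived and only consume a message once the length implied by the first two bytes is fully present. ParseLength and HandleMessage are placeholders, and this is untested against the actual device:

    // Requires: using System.Collections.Generic;
    private readonly List<byte> rxBuffer = new List<byte>();

    private void Comport_DataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        // Drain whatever has arrived into our own buffer.
        byte[] chunk = new byte[comport.BytesToRead];
        int n = comport.Read(chunk, 0, chunk.Length);
        for (int i = 0; i < n; i++)
            rxBuffer.Add(chunk[i]);

        // Consume complete messages; keep partial ones for the next event.
        while (rxBuffer.Count >= 2)
        {
            // ParseLength as above: payload length implied by the 2-byte header.
            int total = 2 + ParseLength(new[] { rxBuffer[0], rxBuffer[1] });
            if (rxBuffer.Count < total)
                break; // message not yet complete

            byte[] message = rxBuffer.GetRange(0, total).ToArray();
            rxBuffer.RemoveRange(0, total);
            HandleMessage(message); // placeholder for the actual message handler
        }
    }

Note that DataReceived fires on a thread-pool thread, so anything else touching rxBuffer would need locking.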
