I'm writing a bit of code to connect to a Bluetooth device. For the purposes of this question, the device can be considered to receive any number of bytes, buffer them, and respond with 0x06 on success or 0x15 on failure.
The problem I am having is in receiving these return bytes.
I establish a connection to the device using the 32feet library's BluetoothClient object. I then open a NetworkStream to communicate with the device, and for each line I write the bytes and then read the response.
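For reference, the connection is set up roughly like this (the device address is a placeholder, not the real one):

using System.Net.Sockets;
using InTheHand.Net;
using InTheHand.Net.Bluetooth;
using InTheHand.Net.Sockets;

// Sketch of the connection setup; the address below is a placeholder.
BluetoothAddress address = BluetoothAddress.Parse("00:11:22:33:44:55");
BluetoothClient client = new BluetoothClient();
client.Connect(new BluetoothEndPoint(address, BluetoothService.SerialPort));
NetworkStream stream = client.GetStream();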
public int Upload(NetworkStream stream, List<string> hexLines) {
    int message = 0;
    byte[] data = null;
    try {
        for (int i = 0; i < hexLines.Count; i++) {
            // Send one line, terminated with a carriage return
            data = Encoding.ASCII.GetBytes(hexLines[i] + "\r");
            stream.Write(data, 0, data.Length);

            // Read the single-byte acknowledgement (expecting 0x06 or 0x15)
            message = stream.ReadByte();
            switch (message) {
                //Return something depending on response
            }
        }
    }
    catch {
        //Do some error handling
    }
    finally {
        //Tidy up
    }
    return message;
}
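For context, the method is called along these lines (the file name is just illustrative):

// Hypothetical call site; "firmware.hex" stands in for the real file.
// (Needs System.IO and System.Linq.)
List<string> hexLines = File.ReadAllLines("firmware.hex").ToList();
int result = Upload(stream, hexLines);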
What I expect to happen is for ReadByte() to return either 0x06 or 0x15 and nothing else. What I in fact observe is that 0x11 and 0x13 are often returned. Given that the Bluetooth device sends no other data, and I am only reading a single byte, I am confused about where these unexpected bytes are coming from.
I have found that adding a short Thread.Sleep(x) between the write and the read results in consistently reading only 0x06 or 0x15 as expected, but as this is a Bluetooth application I don't know the minimum delay I can safely use, and I don't want to artificially slow down the application.
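For clarity, the workaround currently looks like this inside the loop (the 50 ms value is arbitrary and just happens to be long enough on my setup):

stream.Write(data, 0, data.Length);
Thread.Sleep(50);             // arbitrary delay; works here but feels fragile
message = stream.ReadByte();  // now consistently 0x06 or 0x15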
What might be the cause of these extra bytes on the NetworkStream? Is there a more robust way to avoid the issue than Thread.Sleep()?
Thanks for any help.