I’m working on a client and a server that communicate over TCP/IP. The code is completely asynchronous – that is, it uses TcpClient’s ConnectAsync/Close/ReadAsync/WriteAsync methods to connect/disconnect/read/write respectively.
My GUI has Connect and Disconnect buttons. The Connect button disconnects and then immediately connects:
Disconnect();
Connect();
The Disconnect button only disconnects:
Disconnect();
Scenario 1: press Connect. Scenario 2: press Disconnect, then press Connect. The problem I’m facing is that in Scenario 1 the client duly disconnects but reconnects only if I add a delay of about 100 ms:
Disconnect();
Thread.Sleep(100);
Connect();
The second scenario always works, which is of course expected, since the “manual” delay between the two button presses is much longer than 100 ms.
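For completeness, Disconnect simply closes the current client (a minimal sketch – the tcpClient field name is a placeholder, not my exact code):

```csharp
using System.Net.Sockets;

private TcpClient tcpClient;

private void Disconnect()
{
    // Close() disposes the underlying socket; a closed TcpClient cannot be reused,
    // so Connect() creates a fresh instance every time
    tcpClient?.Close();
    tcpClient = null;
}
```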
The connection code calls TcpClient.ConnectAsync() and awaits the task’s completion. The task completes normally; after it completes I also check three other things:
Debug.Assert(tcpClient.Connected);
Debug.Assert(tcpClient.Client.Connected);
Debug.Assert(IsConnected(tcpClient.Client));
…
bool IsConnected(Socket client) // fairly well-known way to check a connection
{
    bool blockingState = client.Blocking;
    try
    {
        var bytes = new byte[1];
        client.Blocking = false;
        // a zero-byte non-blocking send probes the connection without transferring data
        client.Send(bytes, 0, SocketFlags.None);
        return true;
    }
    catch (SocketException e)
    {
        // 10035 == WSAEWOULDBLOCK: the socket is still connected, the send would merely block
        return e.NativeErrorCode == 10035;
    }
    finally
    {
        client.Blocking = blockingState;
    }
}
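The connect path looks roughly like this (a sketch with placeholder names, not my exact code):

```csharp
using System.Diagnostics;
using System.Net.Sockets;
using System.Threading.Tasks;

private async Task ConnectAsync(string host, int port)
{
    tcpClient = new TcpClient();               // a fresh TcpClient for every connection
    await tcpClient.ConnectAsync(host, port);  // completes normally, no exception thrown

    // all three checks pass immediately after the await
    Debug.Assert(tcpClient.Connected);
    Debug.Assert(tcpClient.Client.Connected);
    Debug.Assert(IsConnected(tcpClient.Client));
}
```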
However, the server listening for connections does not “hear” the client’s connection attempt: evidently, it needs some time to “recuperate” from the disconnection. Now my question: is the above situation normal, and should I indeed introduce a delay between disconnecting and connecting?
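For context, the server side is the usual asynchronous accept loop (a sketch; the names, including HandleClientAsync, are placeholders):

```csharp
using System.Net;
using System.Net.Sockets;
using System.Threading.Tasks;

private async Task ListenAsync(int port)
{
    var listener = new TcpListener(IPAddress.Any, port);
    listener.Start();
    while (true)
    {
        // this only "hears" a client once the connection attempt reaches the listener
        TcpClient client = await listener.AcceptTcpClientAsync();
        _ = HandleClientAsync(client); // hypothetical per-client handler, runs unawaited
    }
}
```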