
I'm sending a large string, 0.443+0.064+-0.120+-0.886+0.15167+-0.26754+0.95153, over a TCP socket connection.

The message I receive does not match the string I send. It is cut off at a seemingly random point, e.g. 43+0.064+-0.120+-0.886+0.15167+-0.26754+0

How can I make sure the full string is read?

This is the client code:

public static void SendMessage(string message)
{
    if (socketConnection == null)
    {
        return;
    }

    using (BinaryWriter writer = new BinaryWriter(socketConnection.GetStream(), Encoding.ASCII, true))
    {
        writer.Write(message);
        writer.Flush();
    }
}

This is my server code:

private void ListenForIncomingRequests()
{
    tcpListener = new TcpListener(IPAddress.Parse("127.0.0.1"), 8080);
    tcpListener.Start();
    connectedTcpClient = tcpListener.AcceptTcpClient();

    using (BinaryReader reader = new BinaryReader(connectedTcpClient.GetStream()))
    {
        while (true)
        {
            // This is where the truncated message shows up.
            string clientMessage = reader.ReadString();
        }
    }
}
    Is "0.443+0.064+-0.120+-0.886+0.15167+-0.26754+0.95153" your entire string? Because if so, it's not very large. But if it is larger it could be a fragmentation issue. Try declaring your `string clientMessage` outside the `while (true)` scope and adding to the string like so: `clientMessage += reader.ReadString();` – MindSwipe Feb 26 '19 at 12:59
  • Why are you using Port 8080? That's usually used by http servers. – NineBerry Feb 26 '19 at 13:03
  • @MindSwipe BinaryReader and BinaryWriter use a format where the string is preceded by a prefix which contains the length of the string. So, there cannot be any fragmentation. – NineBerry Feb 26 '19 at 13:05
  • You are using different encodings for writing (ASCII) and reading (UTF-8) which can cause problems for longer strings. Have you tried using the same encoding at both places? – NineBerry Feb 26 '19 at 13:07
  • Please provide a full [mcve] that can be used to reproduce the effect. – NineBerry Feb 26 '19 at 13:09
  • Try setting the Encoding to ASCII as well when reading, so change `BinaryReader(connectedTcpClient.GetStream()))` to `BinaryReader(connectedTcpClient.GetStream(), Encoding.ASCII))`. If this doesn't work you may need to try `Encoding.Unicode` on both ends (as C# uses Unicode (UTF-16) by default) – MindSwipe Feb 26 '19 at 13:12
  • @MindSwipe, that was it! I changed the encoding and it works now. – pvand Feb 26 '19 at 13:57
  • @NineBerry, thanks! I was using different encodings... And I did not know that about port 8080; I will use another. Thanks! – pvand Feb 26 '19 at 13:58

1 Answer


As @NineBerry pointed out in the comments, you're writing ASCII-encoded bytes but reading with the BinaryReader's default encoding, which is UTF-8. Make sure to use the same encoding on both ends: either remove Encoding.ASCII when instantiating your BinaryWriter (so both sides use the UTF-8 default), or pass the same encoding (for example Encoding.Unicode) to both your BinaryWriter AND your BinaryReader.
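For reference, here is a minimal sketch of both ends with matching encodings. It reuses the field names from the question (socketConnection, tcpListener, connectedTcpClient) and only changes the Encoding arguments; Encoding.Unicode is one choice, but any encoding works as long as both sides agree.

public static void SendMessage(string message)
{
    if (socketConnection == null)
    {
        return;
    }

    // Encoding.Unicode here must match the encoding passed to the BinaryReader below.
    using (BinaryWriter writer = new BinaryWriter(socketConnection.GetStream(), Encoding.Unicode, true))
    {
        writer.Write(message); // writes a length prefix, then the encoded bytes
        writer.Flush();
    }
}

private void ListenForIncomingRequests()
{
    tcpListener = new TcpListener(IPAddress.Parse("127.0.0.1"), 8080);
    tcpListener.Start();
    connectedTcpClient = tcpListener.AcceptTcpClient();

    using (BinaryReader reader = new BinaryReader(connectedTcpClient.GetStream(), Encoding.Unicode))
    {
        while (true)
        {
            // ReadString reads the length prefix, then exactly that many bytes,
            // so the whole string is returned in one call.
            string clientMessage = reader.ReadString();
        }
    }
}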

MindSwipe
    I think you initially meant UTF-8 instead of UTF-16. And, for files and streams, UTF-8 would be the more common choice. – Tom Blodget Feb 26 '19 at 17:29
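
As the comment above notes, UTF-8 is the more common choice for files and streams. The same fix works with Encoding.UTF8 on both ends; a sketch, again using the names from the question:

// Client side: encode with UTF-8.
using (BinaryWriter writer = new BinaryWriter(socketConnection.GetStream(), Encoding.UTF8, true))
{
    writer.Write(message);
}

// Server side: decode with the same encoding.
using (BinaryReader reader = new BinaryReader(connectedTcpClient.GetStream(), Encoding.UTF8))
{
    string clientMessage = reader.ReadString();
}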