
I have a C# desktop app. It connects to another PC on my network that runs a UWP C# app.

I am trying to send an image or two to my listening socket, and to test this I have the listening socket send the image back to me.

The trouble is that even though my server receives all the bytes that were originally sent, the image received back by the client is not the same size.

To make this even weirder, sometimes the returned bytes are correct and I get the whole image, and when I attempt to send two images the first one is OK and the second one is not.

Then it will/can revert back to no images being sent back correctly.

I think it may be something to do with the async/await parts, but I am not sure how.

This is my server code:

using (IInputStream input = args.Socket.InputStream)
{
    byte[] data = new byte[BufferSize];
    IBuffer buffer = data.AsBuffer();
    uint dataRead = BufferSize;
    while (dataRead == BufferSize)
    {
        await input.ReadAsync(buffer, BufferSize, InputStreamOptions.Partial);
        requestInBytes.AddRange(data.Take((int) buffer.Length));
        dataRead = buffer.Length;
    }
}
var ct = requestInBytes.Count;

I then strip out the header info:

int counter = 0;
counter = requestCommand[0].Length;
counter = counter + requestCommand[1].Length;
counter = counter + requestCommand[2].Length;
counter = counter + requestCommand[3].Length;
counter = counter + requestCommand[4].Length;
counter = counter + requestCommand[5].Length;
counter = counter + 6;

Now I extract the image:

var imgBody = new byte[totalBytes.Length - counter];
System.Buffer.BlockCopy(totalBytes, counter, imgBody, 0, imgBody.Length);
byteArray = imgBody;

And send just the image back:

using (IOutputStream output = args.Socket.OutputStream)
{
    using (Stream response = output.AsStreamForWrite())
    {
        MemoryStream stream = new MemoryStream(byteArray);
        await response.WriteAsync(byteArray, 0, byteArray.Length);
        await response.FlushAsync();
    }
}

This is my client code:

StringBuilder sb = new StringBuilder();
foreach (var gallery in Shared.CurrentJobGallery)
{
    try
    {
        sb.Clear();
        sb.Append(GeneralTags.ACTION_ADD);
        sb.Append(Shared.DLE);
        sb.Append("GALLERY");
        sb.Append(Shared.DLE);
        sb.Append(Shared.CurrentClientId);
        sb.Append(Shared.DLE);
        sb.Append(gallery.Title);
        sb.Append(Shared.DLE);
        sb.Append(gallery.Description);
        sb.Append(Shared.DLE);
        sb.Append(jobRef);
        sb.Append(Shared.DLE);
        byte[] galleryHdr = Encoding.UTF8.GetBytes(sb.ToString());
        byte[] byteArray = new byte[galleryHdr.Length + gallery.ImageData.Length];

        Buffer.BlockCopy(galleryHdr, 0, byteArray, 0, galleryHdr.Length);
        Buffer.BlockCopy(gallery.ImageData, 0, byteArray, galleryHdr.Length, gallery.ImageData.Length);
        List<byte> requestInBytes2 = new List<byte>();
        System.Diagnostics.Debug.WriteLine("SENT: " + gallery.ImageData.Length.ToString());
        using (TcpClient clientSocket = new TcpClient())
        {
            await clientSocket.ConnectAsync(GeneralTags.RASPBERRY_PI_IP_ADDRESS, GeneralTags.RASPBERRY_PI_PORT);
            using (NetworkStream serverStream = clientSocket.GetStream())
            {
                List<byte> requestInBytes = new List<byte>();

                serverStream.Write(byteArray, 0, byteArray.Length);
                serverStream.Flush();
                int i;
                Byte[] bytes = new Byte[1024];
                do
                {
                    i = serverStream.Read(bytes, 0, bytes.Length);
                    byte[] receivedBuffer = new byte[i];
                    Array.Copy(bytes, receivedBuffer, i);

                    requestInBytes2.AddRange(receivedBuffer);
                } while (serverStream.DataAvailable);

            }
        }

        using (MemoryStream ms = new MemoryStream())
        {
            System.Diagnostics.Debug.WriteLine("BACK: " + requestInBytes2.Count.ToString());
            ms.Write(requestInBytes2.ToArray(), 0, requestInBytes2.ToArray().Length);
            Shared.ViewImage(Image.FromStream(ms, true));
        }
    }
    catch (Exception ex)
    {
        System.Diagnostics.Debug.WriteLine(ex.ToString());
    }
}
Andrew Simpson
  • Do not just read and send the image. Use some protocol for doing it, like RTP. That way you will be able to track and record the packets sent, and you will be able to find out if there is any packet loss over the network. –  Nov 21 '15 at 13:05
  • @dgate I am sure you are right, but do you have any code examples in the C# context? Thanks – Andrew Simpson Nov 21 '15 at 13:07
  • Well, I am no C# programmer, but you can look at RFC 3550. Or else you can add a sequence number before every packet and check on the receiver side whether the client received all packets or not. –  Nov 21 '15 at 13:10
  • @dgate I did have a brief search but there is not a lot out there. However, can you see anything wrong with my code, and do you have a suggestion to improve it? I normally use sockets to stream images and have never had an issue; normally I would use threading. But the latest thing seems to be to use await and async, and if I want to stay current I need to use these things. I suspect this is why my code does not work. I do not believe there is a strong case (in this example) to switch protocols, as TCP and even UDP have been reliable before. :) – Andrew Simpson Nov 21 '15 at 13:14
  • I'm unfamiliar with C# programming. But I can see you are sending a whole image in one shot. Instead of that, find out your MTU and determine a MAX packet size. Then read & send MAX_SIZE packets, each with a sequence number as a header. On the receiver side, use the sequence number for ordering received packets. If any sequence number is missing then you have packet loss. This is best practice for streaming. –  Nov 21 '15 at 13:25
  • @dgate Hi, thanks for all your help. C# is a higher-level language and should take care of all this. I am receiving the bytes in chunks and not in one go, as the code suggests. I could send the number of bytes that should be received by the client, but the issue is that the code thinks there are no more bytes to be received: } while (serverStream.DataAvailable); – Andrew Simpson Nov 21 '15 at 13:28

1 Answer


Your problem is that TCP sockets are based around streams, not packets. It's true that "on the wire" everything is a packet, but when you're using TCP, you have no control over how the data is split up into packets or is reassembled into a stream.

In particular, this line of code is incorrect:

await input.ReadAsync(buffer, BufferSize, InputStreamOptions.Partial);

According to the docs, you must use the buffer returned from ReadAsync. Also note that this buffer may be a partial image, and it's up to your code to detect that situation, read more if necessary, and append those blocks together. Also, the buffer may contain part of one image and part of the next image; it's also up to your code to detect that and handle it correctly.
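
To make this concrete, here is a hedged sketch of what that server read loop could look like when it uses the `IBuffer` actually returned by `ReadAsync`. The names `BufferSize` and `requestInBytes` follow the question's code; this sketch assumes the sender closes (or shuts down) its side of the connection so the read eventually returns zero bytes:

```csharp
// Sketch only: read until the peer closes the stream, always consuming the
// buffer instance that ReadAsync returns rather than the one passed in.
using (IInputStream input = args.Socket.InputStream)
{
    var data = new Windows.Storage.Streams.Buffer(BufferSize);
    while (true)
    {
        IBuffer result = await input.ReadAsync(data, BufferSize, InputStreamOptions.Partial);
        if (result.Length == 0)
            break; // remote side closed the connection; no more data will arrive

        // ToArray() is the extension method from
        // System.Runtime.InteropServices.WindowsRuntime.
        requestInBytes.AddRange(result.ToArray());
    }
}
```

Note this still only tells you when the *connection* ends, not where one *image* ends, which is why framing (below in this answer) matters.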

For this reason, most TCP applications use some form of message framing (described in more detail on my blog). Note that getting this right is surprisingly hard.
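
As an illustration (not the asker's existing protocol), the simplest framing scheme is a 4-byte length prefix per message. `WriteFramedAsync`, `ReadFramedAsync`, and `ReadExactAsync` are hypothetical helper names, sketched here for the desktop (.NET) side:

```csharp
// Hypothetical helpers: prefix each message with its 4-byte length so the
// receiver knows exactly how many bytes belong to one image.
static async Task WriteFramedAsync(Stream stream, byte[] payload)
{
    byte[] lengthPrefix = BitConverter.GetBytes(payload.Length); // 4 bytes
    await stream.WriteAsync(lengthPrefix, 0, lengthPrefix.Length);
    await stream.WriteAsync(payload, 0, payload.Length);
}

static async Task<byte[]> ReadFramedAsync(Stream stream)
{
    byte[] lengthPrefix = await ReadExactAsync(stream, 4);
    int length = BitConverter.ToInt32(lengthPrefix, 0);
    return await ReadExactAsync(stream, length);
}

// Keep reading until exactly 'count' bytes have arrived; a single Read may
// return fewer bytes than requested, even in the middle of a message.
static async Task<byte[]> ReadExactAsync(Stream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = await stream.ReadAsync(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed mid-frame.");
        offset += read;
    }
    return buffer;
}
```

The key difference from the question's client loop is that the receiver stops when it has read the *declared* number of bytes, not when `DataAvailable` happens to be false (which only means no bytes have arrived *yet*).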

I strongly recommend that you use SignalR instead of raw TCP sockets. SignalR handles message framing for you, and it is capable of self-hosting (i.e., it does not require ASP.NET).
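
For reference, a minimal classic SignalR 2 self-host (NuGet package `Microsoft.AspNet.SignalR.SelfHost` plus `Microsoft.Owin.Hosting`) looks roughly like this; the hub name, method names, and URL are placeholders, not anything from the question:

```csharp
using Microsoft.AspNet.SignalR;
using Microsoft.Owin.Hosting;
using Owin;

public class ImageHub : Hub
{
    // Clients call SendImage; the server echoes the data back to the caller.
    // SignalR serializes byte[] as base64, so a string parameter is typical.
    public void SendImage(string base64Image)
    {
        Clients.Caller.receiveImage(base64Image);
    }
}

class Program
{
    static void Main()
    {
        // Self-hosted in a console app: no IIS or ASP.NET pipeline required.
        using (WebApp.Start("http://localhost:8080", app => app.MapSignalR()))
        {
            System.Console.WriteLine("SignalR server running; press Enter to exit.");
            System.Console.ReadLine();
        }
    }
}
```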

Stephen Cleary
  • Hi Stephen. Thanks for your answer. It is useful. The line of code you mentioned is the sample code from MSDN (somewhere) for UWP IoT devices. What should it be? The docs as usual do not give a clean simple example for a simple person like me. I have changed the client code to this: while ( requestInBytes2.Count < gallery.ImageData.Length); – Andrew Simpson Nov 21 '15 at 14:01
  • I have used SignalR before, and whilst I like it, it does have a problem with transferring byte arrays: it converts them to base64, which 'bulks' up the data. I do not believe every problem should be solved with SignalR. There was a time when SignalR did not exist. If anything I would use WebSockets, but I do not have that option using UWP on a Raspberry Pi. I have successfully used sockets many times before, but in this instance it does not seem to work :( – Andrew Simpson Nov 21 '15 at 14:04
  • @AndrewSimpson: The MSDN socket examples are all horrible. Downright wrong. SignalR is not the answer for everything, but (if you can use it on both sides) it is a **lot** easier than raw sockets. – Stephen Cleary Nov 21 '15 at 16:21
  • Hi, I did not mean to imply that the MSDN examples are ALL horrible. But sometimes they do not give example code, and/or they give a whole application when I just want a simple example. Matter of opinion. SignalR is easier (if it can be used) but, a) it does not identify my problem here and b) I prefer to use binary and not base64. My question shows a problem I am having, and I just want to know what the issue could be. I have another app that uses UDP and uploads 30 frames per second of JPEGs with no issue at all, apart from maybe a few dropped frames. I am grateful for your answer – Andrew Simpson Nov 21 '15 at 16:39
  • @AndrewSimpson: The problem is that you need to use message framing. SignalR will do this for you, but since you are using raw sockets, you'll have to do it yourself. – Stephen Cleary Nov 21 '15 at 16:44
  • Hi, yes I do agree with you. However, I cannot use a SignalR host in my UWP app as it resides on a Raspberry Pi 2 device. Also, I am not too concerned about separating the individual images at this stage. I am just concerned about getting the same bytes back as I sent – Andrew Simpson Nov 21 '15 at 16:49