
I am sending and receiving large arrays of objects to and from a WCF service. I am splitting the array into chunks and sending those, so that each call is not too large.

The code looks something like this:

//split data up so there is no "400 - Bad Request" error
int length = 2000;
if (requests.Length > length)
{
    CustomerMatching[] array = new CustomerMatching[length];
    for (int first = 0; first < requests.Length; first += length)
    {
        if (first + length >= requests.Length)
        {
            // The last chunk may be shorter than the others
            length = requests.Length - first;
            array = new CustomerMatching[length];
        }

        Array.Copy(requests, first, array, 0, length);

        r += client.RequestCustomerMatchings(array);
    }
}
else
{
    //Should be able to stream the whole dataset in one go
    r = client.RequestCustomerMatchings(requests);
}

This works fine, but it is much slower than sending it all in one go, and I suspect that having to copy each chunk out of the array in order to send it doesn't help.

Can I send slices of the array one at a time via WCF without copying each chunk?

TMVector
  • I am not sure how much copying will go on behind the scenes, but you could "slice" up an array using the `Skip().Take()` LINQ methods. – Ben Robinson Sep 18 '14 at 09:06
  • As @BenRobinson suggested, the best way would be dividing it into even chunks and sending them using `.Take(i)`, where i is the iteration of chunks, e.g. `.Take(10)`. – foo-baar Sep 20 '14 at 17:10
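The Skip/Take idea from the comments can be sketched like this. Note that it does not actually avoid copying: `ToArray()` still materializes each chunk, so this is tidier than manual `Array.Copy`, not faster. The helper name is my own; only `client.RequestCustomerMatchings` and `CustomerMatching` come from the question.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class ChunkHelper
{
    // Splits an array into chunks of at most `size` elements using
    // Skip/Take, as suggested in the comments. ToArray() copies each
    // chunk, so allocation-wise this is equivalent to Array.Copy.
    public static IEnumerable<T[]> Chunks<T>(T[] source, int size)
    {
        for (int i = 0; i < source.Length; i += size)
            yield return source.Skip(i).Take(size).ToArray();
    }
}
```

With that, the sending loop collapses to `foreach (var chunk in ChunkHelper.Chunks(requests, 2000)) r += client.RequestCustomerMatchings(chunk);`.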

2 Answers


You said that this solution is much slower than sending the whole thing in one go, but there could be many reasons for that. For example, you send all your chunks sequentially, one after the other; depending on the logic of your service, that alone could account for the slowdown.

Consider sending the requests in parallel instead. That way your overall latency should be reduced (though probably not to that of a single request), and you avoid allocating large memory buffers, which can be harder for .NET to garbage-collect.
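A minimal sketch of the parallel approach, assuming your generated WCF proxy exposes a task-based `RequestCustomerMatchingsAsync` method (an option when adding the service reference) and that the operation returns an `int`; the `IMatchingClient` interface and placeholder `CustomerMatching` class below stand in for your real generated types:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

// Placeholder for the data contract from the question.
public class CustomerMatching { }

// Hypothetical interface standing in for the generated WCF proxy.
public interface IMatchingClient
{
    Task<int> RequestCustomerMatchingsAsync(CustomerMatching[] chunk);
}

public static class ParallelSender
{
    // Starts one request per chunk, then awaits them all together,
    // so the calls overlap instead of running back to back.
    public static async Task<int> SendAllAsync(
        IMatchingClient client, CustomerMatching[] requests, int length)
    {
        var tasks = new List<Task<int>>();
        for (int first = 0; first < requests.Length; first += length)
        {
            int count = Math.Min(length, requests.Length - first);
            var chunk = new CustomerMatching[count];
            Array.Copy(requests, first, chunk, 0, count);
            tasks.Add(client.RequestCustomerMatchingsAsync(chunk));
        }
        int[] results = await Task.WhenAll(tasks);
        return results.Sum();
    }
}
```

Whether this actually helps depends on the service being able to process concurrent calls; if it serializes them server-side, you gain little.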

Assuming that `requests.Length` is not too large compared to `length` (let's say, 10-10000 times), I seriously doubt that copying the arrays is your bottleneck.

Roman

If your returned result is not ridiculously large [1TB], ideally it should be done in one single call. However, you should amend your config values, especially maxArrayLength, as described in the following, to support this: Maximum array length quota
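For reference, the quota lives in the `readerQuotas` element of the binding in your client (and service) config; a sketch with illustrative values, assuming a `basicHttpBinding`:

```xml
<!-- Raise the reader quotas so larger arrays are accepted in one call.
     The 64 MB values below are illustrative, not a recommendation. -->
<basicHttpBinding>
  <binding name="LargeMessageBinding"
           maxReceivedMessageSize="67108864">
    <readerQuotas maxArrayLength="67108864"
                  maxStringContentLength="67108864" />
  </binding>
</basicHttpBinding>
```

Note that `maxReceivedMessageSize` usually needs to be raised alongside `maxArrayLength`, or the larger payload is still rejected.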

As Ben suggested, you should return only the requested results and skip the unnecessary items.

MHOOS