I'm currently working on a Windows Service that handles data acquisition from multiple measurement instruments connected to my computer via USB. The service sends some of the data to an SQL database, but I'm also building a second application so the machine can view the data locally in real time, and in the worst-case scenarios I'll be dealing with arrays/lists of over 7,000,000 elements.
Currently I'm using WCF with NetNamedPipeBinding for inter-process communication, and it works great (I can transfer an array of over 7 million doubles in under a quarter of a second). So I don't need answers urgently, but I'm curious whether there are faster or easier ways to get at the data held by the service.
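For context, the hosting side looks roughly like the sketch below. The contract, service class, pipe address, and quota values are simplified placeholders rather than my exact code; the main point is that the default message size and array quotas have to be raised to move arrays this large.

using System;
using System.ServiceModel;

// Simplified placeholder contract/service, not my real code.
[ServiceContract]
public interface IDataService
{
    [OperationContract]
    double[] GetDoubles(int n);
}

public class DataService : IDataService
{
    public double[] GetDoubles(int n)
    {
        return new double[n];
    }
}

class HostSketch
{
    static void Main()
    {
        var binding = new NetNamedPipeBinding
        {
            // The defaults are far too small for ~7 million doubles (~56 MB),
            // so the message size and array quotas need to be raised.
            MaxReceivedMessageSize = 256L * 1024 * 1024
        };
        binding.ReaderQuotas.MaxArrayLength = int.MaxValue;

        using (var host = new ServiceHost(typeof(DataService),
                                          new Uri("net.pipe://localhost/data")))
        {
            host.AddServiceEndpoint(typeof(IDataService), binding, "");
            host.Open();

            Console.WriteLine("Host running; press Enter to stop.");
            Console.ReadLine();
        }
    }
}

The client side needs the same quota increases on its copy of the binding, otherwise large responses get rejected before they ever reach my code.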
I have been thinking about delving into unmanaged memory and having the service return a pointer to the array, or something similar, but I don't want to bother with that if the gains are minimal. The real issue is that performance tanks when I pass a class (I suspect a struct would have less overhead; see the sketch after the service code below), and I'm trying to build a good foundation in case I start dealing with more complex data types.
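To make the shared-memory idea concrete, this is roughly what I had in mind, using a named memory-mapped file rather than raw pointers. The map name, fixed capacity, and the fact that both sides live in one process here are just placeholders for the sketch:

using System;
using System.IO.MemoryMappedFiles;

// Sketch of the shared-memory idea: the service writes the samples into a
// named memory-mapped file and the viewer opens the same name and reads them.
// The map name and fixed capacity are placeholder assumptions.
class SharedMemorySketch
{
    const string MapName = "InstrumentData";
    const int MaxSamples = 7000000;

    static void Main()
    {
        // "Service" side: create the map and write an array of doubles into it.
        var samples = new double[MaxSamples];
        using (var mmf = MemoryMappedFile.CreateNew(MapName, sizeof(double) * (long)MaxSamples))
        using (var writer = mmf.CreateViewAccessor())
        {
            writer.WriteArray(0, samples, 0, samples.Length);

            // "Client" side (normally a separate process): open by name and read.
            using (var client = MemoryMappedFile.OpenExisting(MapName))
            using (var reader = client.CreateViewAccessor())
            {
                var copy = new double[MaxSamples];
                reader.ReadArray(0, copy, 0, copy.Length);
                Console.WriteLine("Read {0} doubles from shared memory.", copy.Length);
            }
        }
    }
}

The appeal is that the viewer could read the samples more or less in place instead of having WCF serialize and copy ~56 MB per call, but I don't know whether the difference would be worth the extra bookkeeping (synchronization, knowing how many samples are currently valid, and so on).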
Service-related code
using System.Diagnostics;

public class testclass
{
    public double dub1 { get; set; }
    public double dub2 { get; set; }
}

// Declared as a field of the service class so both operations can use it.
private readonly Stopwatch sw = new Stopwatch();

public testclass[] GetList(int n)
{
    sw.Restart();
    var numbers = new testclass[n];
    for (var i = 0; i < n; i++)
    {
        numbers[i] = new testclass { dub1 = i, dub2 = i };
    }
    // Stash the build time in the first element so the client can read it.
    numbers[0].dub1 = sw.ElapsedMilliseconds;
    return numbers;
}

public double[] GetDoubles(int n)
{
    sw.Restart();
    var numbers = new double[n];
    // Stash the build time in the first element so the client can read it.
    numbers[0] = sw.ElapsedMilliseconds;
    return numbers;
}
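For reference, this is the kind of struct-based variant I was alluding to above. It's just a sketch I haven't benchmarked, and the type, member, and method names are made up:

using System.Runtime.Serialization;

// Sketch of a struct data contract in place of testclass; untested,
// and the names are placeholders.
[DataContract]
public struct TestStruct
{
    [DataMember]
    public double dub1;

    [DataMember]
    public double dub2;
}

// Hypothetical operation returning the struct version of the list.
public TestStruct[] GetStructList(int n)
{
    var numbers = new TestStruct[n];
    for (var i = 0; i < n; i++)
    {
        numbers[i] = new TestStruct { dub1 = i, dub2 = i };
    }
    return numbers;
}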
Client-related code
using System;
using System.Diagnostics;
using System.Linq;

class Program
{
    static void Main(string[] args)
    {
        while (true)
        {
            Console.WriteLine("Size of List");
            var number1 = int.Parse(Console.ReadLine());
            var sw = new Stopwatch();
            var test = new ServiceReference1.CalculatorClient();

            sw.Restart();
            // Note: ToList() copies the returned array into a new List<double>.
            var list = test.GetDoubles(number1).ToList();
            Console.WriteLine("Response Time: " + sw.ElapsedMilliseconds);
            Console.WriteLine("Time to make list: " + list[0]);
        }
    }
}