At the moment I'm working on a project that contains a fair amount of legacy code, including uses of .NET's non-generic collections such as ArrayList, Hashtable, etc.
I know that using these collections for primitive types is a terrible idea performance-wise, as mentioned in the "Performance Considerations" section of the List<T> documentation (and which I confirmed for myself with a quick and naive LinqPad query, attached at the end).
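The underlying cause is boxing: ArrayList stores every element as object, so each int is boxed on Add and has to be unboxed with a cast on retrieval, whereas List<int> stores the values directly. A minimal sketch of the difference:

// ArrayList: value types are boxed going in and must be cast coming out.
var oldList = new ArrayList();
oldList.Add(42);            // boxes the int into a heap-allocated object
int a = (int)oldList[0];    // unboxing cast required

// List<int>: values are stored inline; no boxing, no cast.
var newList = new List<int>();
newList.Add(42);
int b = newList[0];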
At first glance there doesn't seem to be any problem with doing a straightforward search-and-replace to swap out these old collections. But since this will touch a large portion of the codebase, I'm worried about side effects where List<T> doesn't behave as "expected", given that the applications already rely on ArrayList's specific behaviour.
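For instance, one difference I already know about: ArrayList happily accepts mixed element types, so a mechanical replacement either stops compiling or, if object is kept as the type argument, just moves the failure to runtime. A hypothetical sketch of what I mean:

// Legal with ArrayList: heterogeneous elements.
var mixed = new ArrayList();
mixed.Add(1);
mixed.Add("two");                    // fine; everything is object

// A mechanical swap to List<int> won't compile:
// var typed = new List<int>();
// typed.Add("two");                 // compile-time error

// Swapping to List<object> compiles, but keeps the boxing and
// defers type errors to the casts at the call sites:
var objects = new List<object> { 1, "two" };
int first = (int)objects[0];         // OK
// int second = (int)objects[1];     // InvalidCastException at runtime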
Has anyone done this type of conversion on a large scale before? If so, were there subtle problems not mentioned in the .NET documentation?
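To give a concrete example of the kind of subtle difference I mean, this one on the Hashtable side: the Hashtable indexer returns null for a missing key, while the Dictionary<TKey, TValue> indexer throws KeyNotFoundException, so a straight swap can silently change control flow. A small sketch:

// Hashtable: a missing key yields null rather than an exception.
var table = new Hashtable();
object v1 = table["missing"];            // null

// Dictionary<TKey, TValue>: the same lookup throws.
var dict = new Dictionary<string, int>();
// int v2 = dict["missing"];             // KeyNotFoundException
dict.TryGetValue("missing", out int v3); // safe replacement pattern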
void Main()
{
    var size = 1000000;
    var array = new int[size];
    var list = new List<int>();
    var arrayList = new ArrayList();

    Console.WriteLine("Testing " + size + " insertions...");
    Console.WriteLine();

    // Baseline: raw array writes; no growth, no boxing.
    var stopwatch = Stopwatch.StartNew();
    for (var i = 0; i < size; i++)
    {
        array[i] = i;
    }
    stopwatch.Stop();
    Console.WriteLine("int[]: " + stopwatch.Elapsed.TotalMilliseconds + "ms");

    // Generic list: stores ints directly, so no boxing; only resize cost.
    stopwatch.Restart();
    for (var i = 0; i < size; i++)
    {
        list.Add(i);
    }
    stopwatch.Stop();
    Console.WriteLine("List<int>: " + stopwatch.Elapsed.TotalMilliseconds + "ms");

    // Non-generic list: every int is boxed to object on Add.
    stopwatch.Restart();
    for (var i = 0; i < size; i++)
    {
        arrayList.Add(i);
    }
    stopwatch.Stop();
    Console.WriteLine("ArrayList: " + stopwatch.Elapsed.TotalMilliseconds + "ms");
}
Output on my machine:
Testing 1000000 insertions...
int[]: 3,1063ms
List<int>: 7,2291ms
ArrayList: 111,5214ms
Multiple runs almost always show ArrayList an order of magnitude slower than either int[] or List<int>.