I am trying to build a lookup data structure in C# for a huge data set. The plan is for it to scale to 1 billion entities without degrading performance; lookups should complete in nanoseconds.
So far I have experimented with Lucene.Net and MongoDB. The problem with both is that they take hours to insert that many records, and even after that, lookups take milliseconds rather than nanoseconds.
On the other hand, I have tried using `List<T>` and `ConcurrentBag<T>` in C#. They satisfy the performance constraints, but with 1 billion records the collection takes around 78 GB of RAM.
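For reference, here is a minimal sketch of the in-memory approach I mean. The `Entity` type is a simplified placeholder for my real record (a few numeric and string fields), and the loop count is scaled down; in the real run the loop goes to 1 billion and the process grows to roughly 78 GB, which works out to around 78 bytes per entity including object headers and string overhead.

```csharp
using System;
using System.Collections.Generic;

// Placeholder for my real record type. Each instance carries an object
// header plus field storage, and each string adds its own heap object,
// so one entity ends up costing on the order of 70-80 bytes.
class Entity
{
    public long Id;
    public string Key;   // the field I look entities up by
    public string Value;
}

class Program
{
    static void Main()
    {
        var list = new List<Entity>();

        // Scaled down here; the real run inserts 1_000_000_000 entities
        // and the working set grows to ~78 GB.
        for (long i = 0; i < 1_000_000; i++)
        {
            list.Add(new Entity { Id = i, Key = "key" + i, Value = "val" + i });
        }

        Console.WriteLine(list.Count);
    }
}
```

Insertion is fast and lookups (once the data is indexed, e.g. via a `Dictionary<string, Entity>` keyed on `Key`) are in the nanosecond range, but the memory cost is what I am trying to get around.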
Is there any better way to work around this?