This is just an idea, I don't have any code yet, and I need some design advice. I would like to implement a cache (non-distributed in the first instance) using MemoryMappedFile in C#. I think a B-tree would be a good underlying structure, but this is debatable as well. So the questions are:
- Is a B-tree a good strategy for fast item lookup when the underlying storage is a memory-mapped file?
- What tips and tricks are there for working with memory-mapped files? How large can a view be, and what are the drawbacks when it is too small or too large?
- Multithreading considerations: how do we deal with memory-mapped files and concurrency? A cache is supposed to be hit heavily by clients, so which strategy gives the best performance? (See the rough sketch after this list.)
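
To make the questions concrete, here is a minimal sketch of the kind of thing I have in mind: one view accessor over the whole file, with a `ReaderWriterLockSlim` so many readers can hit the cache concurrently while writers take an exclusive lock. The class and member names are made up, and the layout (a simple length-prefixed record at a given offset) is just for illustration, not the B-tree itself:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Threading;

class MmfCacheSketch : IDisposable
{
    private readonly MemoryMappedFile _mmf;
    private readonly MemoryMappedViewAccessor _view;
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    public MmfCacheSketch(string path, long capacityBytes)
    {
        // Persisted file-backed map; mapName is null to keep it unnamed/portable.
        _mmf = MemoryMappedFile.CreateFromFile(path, FileMode.OpenOrCreate, null, capacityBytes);
        _view = _mmf.CreateViewAccessor(0, capacityBytes);
    }

    public void WriteRecord(long offset, byte[] payload)
    {
        _lock.EnterWriteLock();
        try
        {
            _view.Write(offset, payload.Length);                          // length prefix
            _view.WriteArray(offset + sizeof(int), payload, 0, payload.Length);
        }
        finally { _lock.ExitWriteLock(); }
    }

    public byte[] ReadRecord(long offset)
    {
        _lock.EnterReadLock();
        try
        {
            int length = _view.ReadInt32(offset);
            var buffer = new byte[length];
            _view.ReadArray(offset + sizeof(int), buffer, 0, length);
            return buffer;
        }
        finally { _lock.ExitReadLock(); }
    }

    public void Dispose()
    {
        _view.Dispose();
        _mmf.Dispose();
        _lock.Dispose();
    }
}
```

The single coarse-grained lock is obviously the simplest option; part of my question is whether something finer-grained (per-node or per-page locking, or lock-free reads) is worth the complexity for a heavily read cache.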
As @Internal Server Error asked, I am supplementing the question with this: the key would be a string, about 64 chars max length. The data would be a byte[] up to about 1024 bytes, but consider an average of 128 bytes. Or better: what I want to cache are OR/M entities, so consider the size in bytes of an entity serialized with something like a BSON serializer.
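
To get a feel for that entry size, I would probably just serialize a representative entity and measure the byte count, along these lines (`CustomerEntity` is only an example of mine, and I am assuming Json.NET's BSON support here, where `BsonDataWriter` lives in the Newtonsoft.Json.Bson package; older Json.NET versions had a built-in `BsonWriter` instead):

```csharp
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Bson;

// Hypothetical entity, only used to probe the serialized size.
class CustomerEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

static class BsonSizeProbe
{
    public static byte[] ToBson(object entity)
    {
        using (var ms = new MemoryStream())
        {
            using (var writer = new BsonDataWriter(ms))
            {
                new JsonSerializer().Serialize(writer, entity);
            }
            // The array length is roughly how many bytes one cache entry needs.
            return ms.ToArray();
        }
    }
}
```

Then something like `BsonSizeProbe.ToBson(new CustomerEntity { Id = 1, Name = "Bob", Email = "bob@example.com" }).Length` would tell me roughly how big a typical cache slot has to be, which feeds directly into choosing fixed-size pages versus variable-length records in the file.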