
This is just an idea; I don't have any code yet and I need some design advice. I would implement a cache (non-distributed in the first instance) using MemoryMappedFile in C#. I think a B-tree would be a good underlying structure, but this is debatable as well. So the questions are:

  • Is a B-tree a good strategy for fast item lookup when the underlying storage is a memory-mapped file?
  • What tips and tricks are there for memory-mapped files? How large can a view be, and what are the drawbacks when it is too small or too large?
  • Multithreading considerations: how do we deal with memory-mapped files and concurrency? A cache is supposed to be hit heavily by clients, so what strategy gives the best performance?

As @Internal Server Error asked, I'll extend the question with this: the key would be a string, about 64 chars max length. The data would be a byte[] up to about 1024 bytes, but consider an average of 128 bytes. Or better: what I want to cache are OR/M entities, so consider how long a serialized entity is in bytes with something like a BSON serializer.
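For concreteness, one possible fixed-size slot layout matching these sizes might look like the sketch below (the names and the exact on-disk format are just placeholders, nothing I've settled on):

```csharp
using System;
using System.Text;

// Hypothetical fixed-size slot for one cache entry; a fixed slot size keeps
// offsets in the memory-mapped file trivial to compute.
static class CacheSlot
{
    public const int MaxKeyBytes = 64 * sizeof(char);   // 64-char key as UTF-16
    public const int MaxValueBytes = 1024;               // worst-case serialized entity
    public const int HeaderBytes = sizeof(short) * 2;    // key length + value length
    public const int SlotSize = HeaderBytes + MaxKeyBytes + MaxValueBytes;

    // Pack a key/value pair into one slot (the format is an assumption for illustration).
    public static byte[] Pack(string key, byte[] value)
    {
        byte[] keyBytes = Encoding.Unicode.GetBytes(key);
        if (keyBytes.Length > MaxKeyBytes || value.Length > MaxValueBytes)
            throw new ArgumentException("key or value too large for a slot");

        var slot = new byte[SlotSize];
        BitConverter.GetBytes((short)keyBytes.Length).CopyTo(slot, 0);
        BitConverter.GetBytes((short)value.Length).CopyTo(slot, 2);
        keyBytes.CopyTo(slot, HeaderBytes);
        value.CopyTo(slot, HeaderBytes + MaxKeyBytes);
        return slot;
    }
}
```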

Felice Pollano
  • 32,832
  • 9
  • 75
  • 115

1 Answer

  • A B-Tree is good (with memory-mapped files), but if the file is not always entirely resident in memory, then a page-aligned B+Tree is much better. See also.
  • The trick with memory-mapped files is to use a 64-bit architecture so that you can map the entire file into memory; otherwise you can only map parts of it, and plain cached reads might then be faster than mmap.
  • Try CAS (compare-and-swap) over the shared memory (a sketch of this, together with mapping the whole file, follows this list). See also.
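A minimal sketch of the last two points, assuming a 64-bit process, a hypothetical cache.dat file, and an arbitrary 64 MB capacity; the word being CAS-ed at offset 0 is just for illustration:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Threading;

class MmfCasSketch
{
    const string CacheFile = "cache.dat";        // assumed file name
    const long CacheSize = 64L * 1024 * 1024;    // assumed 64 MB capacity

    static unsafe void Main()
    {
        // On a 64-bit process the whole file can be mapped in a single view.
        using (var mmf = MemoryMappedFile.CreateFromFile(
                   CacheFile, FileMode.OpenOrCreate, null, CacheSize))
        using (var view = mmf.CreateViewAccessor(0, CacheSize))
        {
            byte* basePtr = null;
            view.SafeMemoryMappedViewHandle.AcquirePointer(ref basePtr);
            try
            {
                // Treat the first 4 bytes as a shared version/counter word and
                // update it with compare-and-swap so concurrent writers don't
                // clobber each other. Offset 0 is an arbitrary choice here.
                int* word = (int*)basePtr;
                int observed = *word;
                int updated = observed + 1;
                if (Interlocked.CompareExchange(ref *word, updated, observed) == observed)
                    Console.WriteLine("CAS succeeded");
                else
                    Console.WriteLine("CAS lost the race; retry");
            }
            finally
            {
                view.SafeMemoryMappedViewHandle.ReleasePointer();
            }
        }
    }
}
```

(Compile with unsafe code enabled; in a real cache the CAS word would live in whatever page header or lock table the B+Tree layout defines.)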
ArtemGr
  • 11,684
  • 3
  • 52
  • 85
  • Maybe there is something C#-pish for you at [nosql-database.org](http://nosql-database.org/). Or [some C project](http://highlandsun.com/hyc/mdb/) might serve as a reference. – ArtemGr Oct 21 '12 at 14:23