I was trying to optimize the memory usage of a particular service and stumbled upon a huge dictionary cache that gets queried for random entries very frequently. The problem is that this dictionary takes up more than 1 GB, and the service is almost touching the 2 GB limit (32-bit). The dictionary, once constructed, is never altered.
The dictionary keys and values are strings. Is there a way to compress the entire dictionary while keeping it indexable? I wrote a small POC that uses Huffman encoding with a single code table shared across all entries and indexes on the compressed keys, but I want to know if there are any better alternatives.
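For reference, here is a minimal sketch of what such a POC could look like (assuming Python, since the service's language isn't stated here; names like `CompressedDict`, `build_code`, `encode`, and `decode` are illustrative, not my actual code):

```python
import heapq
from collections import Counter
from itertools import count

def build_code(texts):
    """Build one Huffman code table {char: bitstring} shared across all texts."""
    freq = Counter(ch for t in texts for ch in t)
    tie = count()  # tie-breaker so the heap never compares the dict payloads
    heap = [(f, next(tie), {ch: ""}) for ch, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Merge two subtrees: prefix left codes with "0" and right codes with "1".
        merged = {c: "0" + code for c, code in left.items()}
        merged.update({c: "1" + code for c, code in right.items()})
        heapq.heappush(heap, (f1 + f2, next(tie), merged))
    return heap[0][2]

def encode(text, code):
    """Compress a string to bytes; the first byte records the padding bit count."""
    bits = "".join(code[ch] for ch in text)
    pad = -len(bits) % 8
    bits += "0" * pad
    body = int(bits, 2).to_bytes(len(bits) // 8, "big") if bits else b""
    return bytes([pad]) + body

def decode(blob, rev):
    """Decompress bytes back to a string using the reversed code table."""
    pad, body = blob[0], blob[1:]
    bits = bin(int.from_bytes(body, "big"))[2:].zfill(len(body) * 8)
    if pad:
        bits = bits[:-pad]
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in rev:
            out.append(rev[cur])
            cur = ""
    return "".join(out)

class CompressedDict:
    """Read-only mapping whose keys and values are stored Huffman-compressed."""
    def __init__(self, source):
        self.code = build_code(list(source.keys()) + list(source.values()))
        self.rev = {v: k for k, v in self.code.items()}
        self.data = {encode(k, self.code): encode(v, self.code)
                     for k, v in source.items()}

    def __getitem__(self, key):
        # Compress the query key so the lookup is done on compressed keys.
        return decode(self.data[encode(key, self.code)], self.rev)

cache = CompressedDict({"alpha": "first entry", "beta": "second entry"})
print(cache["alpha"])  # -> "first entry"
```

This keeps every lookup O(key length) since only the queried value is decompressed, but savings depend on how repetitive the strings are.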
Options I have had to rule out for various reasons:

- Using a database or external storage, as it becomes extremely slow.
- Lazy loading, since all the entries get used at least once within a few minutes.
- Using a distributed cache.