I'm using Python 3's built-in functools.lru_cache
decorator to memoize some expensive functions. I would like to memoize as many calls as possible without using too much memory, since caching too many values causes thrashing.
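For context, here's a minimal sketch of what I'm doing now (the function and its body are just illustrative placeholders):

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # fixed maxsize; I want memory-aware eviction instead
def expensive(n: int) -> int:
    # stand-in for an expensive computation
    return sum(i * i for i in range(n))
```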
Is there a preferred technique or library for accomplishing this in Python?
For example, this question led me to a Go library for system-memory-aware LRU caching. Something similar for Python would be ideal.
Note: I can't just estimate the memory used per value and set maxsize
accordingly, because several processes will be calling the decorated function in parallel; a solution would need to dynamically check how much memory is actually free at runtime.
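To make the desired behavior concrete, here's a rough sketch of the kind of decorator I have in mind. It uses the third-party psutil library to read available system memory and evicts least-recently-used entries when free memory drops below a threshold; the decorator name and threshold are my own invention, not an existing API:

```python
import collections
import functools

import psutil  # third-party; assumed available


def memory_aware_cache(min_free_bytes=512 * 1024 * 1024):
    """Hypothetical LRU cache that evicts entries whenever system
    free memory falls below min_free_bytes."""
    def decorator(func):
        cache = collections.OrderedDict()

        @functools.wraps(func)
        def wrapper(*args):
            if args in cache:
                cache.move_to_end(args)  # mark as most recently used
                return cache[args]
            result = func(*args)
            cache[args] = result
            # Evict least-recently-used entries while memory is tight.
            while cache and psutil.virtual_memory().available < min_free_bytes:
                cache.popitem(last=False)
            return result

        return wrapper
    return decorator
```

This is only a single-process sketch; it doesn't address coordination between the parallel processes mentioned above, and polling psutil.virtual_memory() on every miss may be too slow for hot functions.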