Questions tagged [lru]

LRU is a family of caching algorithms, where LRU stands for least recently used. As the name implies, the cache discards the least recently used items when its contents exceed a certain threshold. In this algorithm, writes are quite expensive, as the usage information has to be updated every time an item is accessed.

More information at Wikipedia.

304 questions
9
votes
2 answers

LRU cache on disk for python

I am looking for an on-disk LRU cache package for Python; most of the packages I can find are in-memory caches. The main reason is that database access is slow and I have limited RAM for an in-memory LRU, but I do have a large, fast SSD available for the cache.
quantCode
  • 495
  • 1
  • 5
  • 12
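One option people often reach for here is the third-party diskcache package, which keeps the cache in SQLite/files on disk and supports an LRU-style eviction policy. A minimal sketch, assuming diskcache is installed and treating the directory, size limit, and eviction_policy value below as illustrative rather than prescriptive:

# Sketch using the third-party "diskcache" package (pip install diskcache);
# the path, size limit, and eviction policy are assumptions for the example.
import diskcache

cache = diskcache.Cache("/tmp/lru_demo",
                        size_limit=1_000_000_000,            # ~1 GB on the SSD
                        eviction_policy="least-recently-used")

@cache.memoize()
def expensive_query(key):
    # Placeholder for the slow database access described in the question.
    return {"key": key}

print(expensive_query(42))   # first call hits the "database"
print(expensive_query(42))   # second call is served from the on-disk cache
cache.close()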
9
votes
1 answer

What Does Memcached's LRU Actually Mean?

Memcached says it uses an LRU queue to do eviction (with a few rules based around slab sizes mixed in.) When they say least-recently-used, are they referring to least recently stored or least recently read? Their documentation seems ambiguous here.
dave mankoff
  • 17,379
  • 7
  • 50
  • 64
8
votes
5 answers

Usage for lru cache in functools

I want to use lru_cache in my code; however, I get this error: NameError: name 'lru_cache' is not defined. I do have import functools in my code, but that does not help. Example code is…
user308827
  • 21,227
  • 87
  • 254
  • 417
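The NameError in this question is the usual symptom of import functools binding only the module name: lru_cache then has to be referenced as functools.lru_cache, or imported from functools directly. A minimal sketch (the fib function is just an illustrative example):

# Qualify the decorator with the module name...
import functools

@functools.lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# ...or import the decorator itself instead:
# from functools import lru_cache
# @lru_cache(maxsize=None)

print(fib(30))  # 832040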
8
votes
1 answer

C# Production-quality Thread-Safe in-memory LRU Cache with Expiry?

This may be like asking for the moon on a stick, but is there a C# production-quality, thread-safe, in-memory LRU cache with expiry? Or does anyone have a best-practices idea to achieve the same thing? (LRU being "Least Recently Used" -…
Jon Rea
  • 9,337
  • 4
  • 32
  • 35
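The question is C#-specific, but the underlying pattern (a map plus recency ordering, a lock for thread safety, and a per-entry timestamp for expiry) is language-agnostic. A minimal sketch of that pattern in Python, purely to illustrate the structure rather than to answer the C# question; the class name, capacity, and TTL are assumptions:

# Thread-safe LRU cache with expiry: OrderedDict for recency order,
# a Lock for thread safety, and a stored timestamp per entry for TTL.
import threading
import time
from collections import OrderedDict

class TTLLRUCache:
    def __init__(self, capacity=128, ttl_seconds=60.0):
        self.capacity = capacity
        self.ttl = ttl_seconds
        self._lock = threading.Lock()
        self._data = OrderedDict()           # key -> (value, inserted_at)

    def get(self, key, default=None):
        with self._lock:
            item = self._data.get(key)
            if item is None:
                return default
            value, inserted_at = item
            if time.monotonic() - inserted_at > self.ttl:
                del self._data[key]          # expired entry
                return default
            self._data.move_to_end(key)      # mark as most recently used
            return value

    def put(self, key, value):
        with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = (value, time.monotonic())
            if len(self._data) > self.capacity:
                self._data.popitem(last=False)   # evict least recently used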
8
votes
2 answers

How is an LRU cache implemented in a CPU?

I'm studying up for an interview and want to refresh my memory on caching. If a CPU has a cache with an LRU replacement policy, how is that actually implemented on the chip? Would each cache line store a timestamp tick? Also what happens in a dual…
fred basset
  • 9,774
  • 28
  • 88
  • 138
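Real hardware usually keeps a few bits of replacement state per way (age counters, or a pseudo-LRU tree) rather than full timestamps. A toy Python simulation of counter-based LRU for a single 4-way set, offered as an illustration of the textbook scheme and not of any particular chip:

# Toy simulation of counter-based LRU for one 4-way cache set.
# Each way keeps a 2-bit "age"; on a hit the hit way becomes age 0 and
# every way that was more recent than it ages by one.
WAYS = 4

def touch(ages, way):
    old = ages[way]
    for w in range(WAYS):
        if ages[w] < old:
            ages[w] += 1            # everything more recent gets older
    ages[way] = 0                   # touched way is now most recently used

def victim(ages):
    return ages.index(max(ages))    # evict the oldest way

ages = [3, 2, 1, 0]                 # way 0 is LRU, way 3 is MRU
touch(ages, 0)
print(ages)          # [0, 3, 2, 1] -> way 1 is now the oldest
print(victim(ages))  # 1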
7
votes
3 answers

How can I make my simple .NET LRU cache faster?

Last night and tonight I tried a few different approaches and came up with one similar to the one laid out below by Jeff (I had even already done what he suggested in his update, and put together my own simple LL implementation for additional…
David Hay
  • 3,027
  • 2
  • 27
  • 29
7
votes
3 answers

Can One Replace or Remove a specific key from functools.lru_cache?

I'm using a functools.lru_cache to serve temp file paths given certain input*. However, in case the path no longer exists, I would like to remove/replace the single corresponding key. The cache_clear() method would be overkill and cache_info() does not…
Fletch F Fletch
  • 411
  • 5
  • 7
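functools.lru_cache only exposes cache_clear() and cache_info(); there is no public way to evict a single key. A common workaround is a small hand-rolled memoizer that keeps its own OrderedDict so individual entries can be dropped. A minimal sketch, with made-up names (invalidatable_lru_cache and temp_path_for are illustrative):

# Hand-rolled LRU memoizer that, unlike functools.lru_cache, allows a
# single key to be invalidated.
from collections import OrderedDict
from functools import wraps

def invalidatable_lru_cache(maxsize=128):
    def decorator(func):
        cache = OrderedDict()

        @wraps(func)
        def wrapper(*args):
            if args in cache:
                cache.move_to_end(args)
                return cache[args]
            result = func(*args)
            cache[args] = result
            if len(cache) > maxsize:
                cache.popitem(last=False)     # drop least recently used
            return result

        wrapper.invalidate = lambda *args: cache.pop(args, None)
        wrapper.cache_clear = cache.clear
        return wrapper
    return decorator

@invalidatable_lru_cache(maxsize=256)
def temp_path_for(name):
    return f"/tmp/{name}.dat"      # placeholder for the real path logic

temp_path_for("a")
temp_path_for.invalidate("a")      # drop just this entry if the path vanished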
7
votes
2 answers

Clearing lru_cache of certain methods when an attribute of the class is updated?

I have an object with a method/property multiplier. This method is called many times in my program, so I've decided to use lru_cache() on it to improve the execution speed. As expected, it is much faster: The following code shows the problem: from…
agiap
  • 503
  • 1
  • 6
  • 16
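One common pattern for this situation is to expose the attribute through a property whose setter calls cache_clear() on the cached method; note that an lru_cache placed on a method is shared by all instances of the class. A minimal sketch, with illustrative names (Model, factor, multiplier):

# Clear a cached method whenever the attribute it depends on changes.
from functools import lru_cache

class Model:
    def __init__(self, factor):
        self._factor = factor

    @property
    def factor(self):
        return self._factor

    @factor.setter
    def factor(self, value):
        self._factor = value
        Model.multiplier.cache_clear()   # invalidate stale cached results

    @lru_cache(maxsize=None)
    def multiplier(self, x):
        return self._factor * x

m = Model(2)
print(m.multiplier(10))   # 20, computed and cached
m.factor = 3
print(m.multiplier(10))   # 30, recomputed after the cache was cleared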
7
votes
4 answers

LRU algorithm: how many bits are needed to implement this algorithm?

I have a little question about the LRU algorithm. If you have a cache with four blocks, how many bits do you need to implement this algorithm?
Latsuj
  • 469
  • 1
  • 5
  • 14
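The usual back-of-the-envelope comparison for a 4-block set: exact LRU needs enough bits to encode one of the 4! possible orderings (5 bits), the simple per-block counter scheme keeps 2 bits per block (8 bits total), and tree pseudo-LRU manages with 3 bits. A small Python check of that arithmetic, offered as a worked example rather than the definitive answer:

# Worked arithmetic for a 4-block (4-way) set.
import math

blocks = 4
orderings = math.factorial(blocks)                         # 24 possible LRU orderings
exact_lru_bits = math.ceil(math.log2(orderings))           # 5 bits to encode any ordering
counter_lru_bits = blocks * math.ceil(math.log2(blocks))   # 2 bits per block -> 8 bits
tree_plru_bits = blocks - 1                                 # 3 bits for tree pseudo-LRU

print(exact_lru_bits, counter_lru_bits, tree_plru_bits)     # 5 8 3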
7
votes
1 answer

Redis maxmemory-policy: performances of volatile-lru vs allkeys-lru

Assuming all keys in a Redis instance have an expire set, volatile-lru and allkeys-lru are similar. But is there a significant performance difference between the two when a key is removed? Bonus question: between 2 distinct instances configured with…
colinux
  • 3,989
  • 2
  • 20
  • 19
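Both policies are selected through maxmemory-policy, so any benchmark comes down to flipping that one setting. For reference, a sketch of switching between the two from redis-py (config_set and config_get are real redis-py calls; the connection details and the 100mb limit are assumptions for the example):

# Switching between the two eviction policies from redis-py.
import redis

r = redis.Redis(host="localhost", port=6379)
r.config_set("maxmemory", "100mb")

r.config_set("maxmemory-policy", "volatile-lru")  # evict only keys that have a TTL
# ... or ...
r.config_set("maxmemory-policy", "allkeys-lru")   # evict any key, TTL or not

print(r.config_get("maxmemory-policy"))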
6
votes
6 answers

How to design a least recently used cache?

How to design a least recently used cache? Suppose that you have visited some items. You need to design a data structure to hold these items. Each item is associated with its latest visit time. Each time you visit an item, check it in…
user1002288
  • 4,860
  • 10
  • 50
  • 78
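The classic design is a hash map for O(1) lookup combined with a doubly linked list that records recency, so both lookup and eviction stay O(1). In Python, collections.OrderedDict bundles the two, which makes for a compact sketch (the capacity and names are illustrative):

# Classic LRU cache design: hash lookup plus recency order. OrderedDict
# combines both; a hash map + hand-written doubly linked list is equivalent.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)          # visiting refreshes recency
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict the least recently visited

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")            # "a" is now most recently used
cache.put("c", 3)         # evicts "b"
print(list(cache.items))  # ['a', 'c']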
6
votes
2 answers

Make built-in lru_cache skip caching when function returns None

Here's a simplified function to which I'm trying to add an lru_cache:

from functools import lru_cache, wraps

@lru_cache(maxsize=1000)
def validate_token(token):
    if token % 3:
        return None
    return True

for x in range(1000): …
Max
  • 668
  • 4
  • 13
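lru_cache stores whatever the decorated function returns, including None, but it never stores the result of a call that raises. One workaround is a wrapper that turns a None result into an internal exception on the cached side and converts it back for the caller. A minimal sketch, with made-up helper names (lru_cache_skip_none, _NoneResult):

# Make an lru_cache-backed function skip caching whenever it returns None,
# by letting the cached inner function raise instead of returning None.
from functools import lru_cache, wraps

class _NoneResult(Exception):
    pass

def lru_cache_skip_none(maxsize=128):
    def decorator(func):
        @lru_cache(maxsize=maxsize)
        def cached(*args, **kwargs):
            result = func(*args, **kwargs)
            if result is None:
                raise _NoneResult          # prevents lru_cache from storing it
            return result

        @wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return cached(*args, **kwargs)
            except _NoneResult:
                return None

        wrapper.cache_info = cached.cache_info
        wrapper.cache_clear = cached.cache_clear
        return wrapper
    return decorator

@lru_cache_skip_none(maxsize=1000)
def validate_token(token):
    if token % 3:
        return None
    return True

for x in range(1000):
    validate_token(x % 10)
print(validate_token.cache_info())   # only the non-None results are cached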
6
votes
2 answers

Why does Python's lru_cache perform best when maxsize is a power-of-two?

Documentation says this: If maxsize is set to None, the LRU feature is disabled and the cache can grow without bound. The LRU feature performs best when maxsize is a power-of-two. Would anyone happen to know where does this "power-of-two" come…
Mr Matrix
  • 1,128
  • 2
  • 14
  • 28
6
votes
1 answer

Is there a database-based key eviction policy in Redis when RAM is full?

I am using 5 databases in my Redis server. I want to evict keys belonging to a particular DB using an LRU mechanism. Is it possible? I read this: how-to-make-redis-choose-lru-eviction-policy-for-only-some-of-the-keys. But all my databases are using…
revs
  • 63
  • 7
5
votes
1 answer

Default memory cache with LRU policy

I am trying to implement some caching in my application and I want to use the default memory cache in C# (this requirement can be changed if it won't work). My problem is that I don't want to exceed the maximum amount of physical memory I have on…
aweis
  • 5,350
  • 4
  • 30
  • 46