Questions tagged [lru]

LRU is a family of caching algorithms, where LRU stands for least recently used.

LRU is a family of caching algorithms, where LRU stands for least recently used. As the name implies, the cache discards the least recently used items when its contents exceed a certain threshold. In this family of algorithms, writes are relatively expensive, because the usage information must be updated every time an item is accessed.

More information at Wikipedia.
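The bookkeeping described above maps naturally onto a hash map whose keys are kept in recency order. A minimal sketch in Python (collections.OrderedDict supplies the ordering; the class name and capacity are illustrative, not taken from any particular library):

    from collections import OrderedDict

    class LRUCache:
        """Tiny LRU cache: a dict whose keys are kept in recency order."""

        def __init__(self, capacity):
            self.capacity = capacity
            self._items = OrderedDict()

        def get(self, key, default=None):
            if key not in self._items:
                return default
            # Even a read counts as a "use", so refresh the key's recency.
            self._items.move_to_end(key)
            return self._items[key]

        def put(self, key, value):
            self._items[key] = value
            self._items.move_to_end(key)
            if len(self._items) > self.capacity:
                # Discard the least recently used entry (the oldest key).
                self._items.popitem(last=False)

    cache = LRUCache(capacity=2)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.get("a")        # "a" becomes the most recently used
    cache.put("c", 3)     # evicts "b", the least recently used

The move_to_end call on every access is exactly the extra write cost the description above refers to.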

304 questions
4
votes
2 answers

Why do a reference string and its reverse produce the same number of page faults under LRU and the Optimal page-replacement algorithm?

I was reading Virtual Memory from Operating System Concepts by Galvin and came across a statement, it says: "We can think of LRU strategy as the optimal page-replacement algorithm looking backward in time, rather than forward." Then on the other…
Ayush
  • 2,608
  • 2
  • 22
  • 31
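One way to get a feel for the Galvin quote in the question above is to count faults empirically. A rough sketch (the helper names and the sample reference string are mine) that counts page faults under LRU and under the optimal, clairvoyant algorithm, so a reference string can be compared with its reverse:

    def lru_faults(refs, frames):
        """Count page faults under LRU with a fixed number of frames."""
        memory, faults = [], 0            # front of the list = least recently used
        for page in refs:
            if page in memory:
                memory.remove(page)       # refresh recency on a hit
                memory.append(page)
                continue
            faults += 1
            if len(memory) == frames:
                memory.pop(0)             # evict the least recently used page
            memory.append(page)
        return faults

    def opt_faults(refs, frames):
        """Count page faults under Belady's optimal algorithm."""
        memory, faults = [], 0
        for i, page in enumerate(refs):
            if page in memory:
                continue
            faults += 1
            if len(memory) == frames:
                def next_use(p):          # position of the page's next reference
                    try:
                        return refs.index(p, i + 1)
                    except ValueError:
                        return float("inf")
                # Evict the resident page whose next use is furthest away.
                memory.remove(max(memory, key=next_use))
            memory.append(page)
        return faults

    refs = [1, 2, 3, 4, 2, 1, 5, 6, 2, 1, 2, 3, 7, 6, 3, 2, 1, 2, 3, 6]
    for r in (refs, list(reversed(refs))):
        print("LRU:", lru_faults(r, 3), "OPT:", opt_faults(r, 3))

Running this on the textbook's reference strings and their reverses is a quick way to check the claim in the question for yourself.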
4
votes
1 answer

Does System.Web.Caching utilize an LRU algorithm?

I was just working on the documentation for an open source project I created a while back called WebCacheHelper. It's an abstraction on top of the existing Cache functionality in System.Web.Caching. I'm having trouble finding the details of the…
Steve Wortham
  • 21,740
  • 5
  • 68
  • 90
4
votes
2 answers

Is it possible to use Cassandra as an LRU cache?

I would like to store key-value pairs in Cassandra and have entries automatically deleted in LRU fashion when a fixed storage size is reached. Is it possible to do this with Cassandra, and if so, what would be the best way to do it? If not, is there any…
skyde
  • 2,816
  • 4
  • 34
  • 53
4
votes
5 answers

Why is LRU better than FIFO?

Why is Least Recently Used better than FIFO in relation to page files?
daniel
  • 73
  • 2
  • 2
  • 4
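The short answer usually given is temporal locality: LRU keeps pages that are still being used, while FIFO evicts a page merely because it was loaded long ago, even if it is still hot. A toy simulation (the trace is made up, and the counters are simplified) makes the difference concrete:

    from collections import deque

    def fifo_faults(refs, frames):
        """FIFO: evict the page that has been resident the longest."""
        queue, resident, faults = deque(), set(), 0
        for page in refs:
            if page in resident:
                continue
            faults += 1
            if len(resident) == frames:
                resident.discard(queue.popleft())
            queue.append(page)
            resident.add(page)
        return faults

    def lru_faults(refs, frames):
        """LRU: evict the page whose last use is oldest."""
        memory, faults = [], 0
        for page in refs:
            if page in memory:
                memory.remove(page)       # refresh recency on a hit
                memory.append(page)
                continue
            faults += 1
            if len(memory) == frames:
                memory.pop(0)
            memory.append(page)
        return faults

    # Page 0 is "hot": it is referenced between every other access.
    trace = [0, 1, 0, 2, 0, 3, 0, 4, 0, 1, 0, 2, 0, 3, 0, 4]
    print("FIFO:", fifo_faults(trace, 3))   # 11 faults
    print("LRU:", lru_faults(trace, 3))     # 9 faults

With three frames, FIFO keeps evicting the hot page 0 because it entered the cache early, while LRU never evicts it after the first reference. That is the intuition behind LRU's usual advantage, although FIFO is not necessarily worse on every individual trace.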
4
votes
0 answers

Two-way set-associative cache referencing using LRU

I am trying to understand how caching works. I am working on a problem to better understand this concept: given a 2-way set-associative cache with blocks 1 word in length and a total size of 16 32-bit words, initially empty,…
basil
  • 690
  • 2
  • 11
  • 30
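A small simulator can help when working through a problem like the one above. A sketch assuming word-addressed references (that part is an assumption; the rest — 16 words total, 2 ways, 1-word blocks, hence 8 sets — comes from the problem statement), with LRU order kept per set:

    class TwoWaySetAssociativeCache:
        """Set-associative cache with per-set LRU replacement."""

        def __init__(self, total_words=16, ways=2, block_words=1):
            self.ways = ways
            self.block_words = block_words
            self.num_sets = total_words // (ways * block_words)   # 8 sets here
            # Each set holds its resident block numbers, oldest first.
            self.sets = [[] for _ in range(self.num_sets)]

        def access(self, word_address):
            """Return True on a hit, False on a miss (filling the block)."""
            block = word_address // self.block_words
            index = block % self.num_sets
            lines = self.sets[index]
            if block in lines:
                lines.remove(block)   # refresh this block's recency in its set
                lines.append(block)
                return True
            if len(lines) == self.ways:
                lines.pop(0)          # evict the set's least recently used block
            lines.append(block)
            return False

    cache = TwoWaySetAssociativeCache()
    for addr in [0, 8, 0, 16, 8]:     # addresses 0, 8 and 16 all map to set 0
        print(addr, "hit" if cache.access(addr) else "miss")

Feeding the problem's reference sequence through access() gives the hit/miss pattern and the final cache contents.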
4
votes
1 answer

LRU cache on hard drive python

I want to be able to decorate a function as you would do with functools.lru_cache, however, I want the results to be cached on the hard drive and not in memory. Looking around, I get a feeling this is a solved problem, and I was wondering if anyone…
dardila2
  • 79
  • 1
  • 5
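Packaged solutions exist for this (joblib's Memory, among others), but the core idea fits in a few lines: hash the arguments and pickle the result to a file. The sketch below is a simplified illustration, not a drop-in functools.lru_cache replacement; it has no maxsize or eviction, and it assumes the arguments and result are picklable.

    import functools
    import hashlib
    import os
    import pickle

    def disk_cache(cache_dir=".cache"):
        """Cache a function's return values as pickle files on disk."""
        os.makedirs(cache_dir, exist_ok=True)

        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                # Key the cache on the function name plus its arguments.
                raw = pickle.dumps((func.__name__, args, sorted(kwargs.items())))
                path = os.path.join(cache_dir, hashlib.sha256(raw).hexdigest())
                if os.path.exists(path):
                    with open(path, "rb") as fh:
                        return pickle.load(fh)
                result = func(*args, **kwargs)
                with open(path, "wb") as fh:
                    pickle.dump(result, fh)
                return result
            return wrapper
        return decorator

    @disk_cache()
    def slow_square(x):
        print("computing", x)
        return x * x

    print(slow_square(4))   # computes and writes the cache file
    print(slow_square(4))   # loaded from the cache file, no recomputation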
4
votes
1 answer

LRU cache in C++

Possible duplicate: LRU cache design. I got this question in a programming interview. Feel free to think about how you might answer it. How would you implement an LRU (least-recently-used) cache in C++? Basically, the cache can hold up to N…
JJ Beck
  • 5,193
  • 7
  • 32
  • 36
4
votes
1 answer

Algorithm for lock-free queue with move-to-tail functionality

For an LRU cache I need an algorithm for a lock-free queue similar to the one described in the paper Simple, Fast, and Practical Non-Blocking and Blocking Concurrent Queue Algorithms But to maintain an LRU queue I will also need the possibility to…
ANisus
  • 74,460
  • 29
  • 162
  • 158
4
votes
3 answers

Variable size LRU cache

I am trying to implement an LRU cache in Java which should be able to: change size dynamically, in the sense that I plan to have its entries held as SoftReferences subscribed to a ReferenceQueue. So depending upon the memory consumption, the cache size will…
Jatin
  • 31,116
  • 15
  • 98
  • 163
4
votes
2 answers

Thread-safe (Goroutine-safe) cache in Go

Question 1 I am building/searching for a RAM memory cache layer for my server. It is a simple LRU cache that needs to handle concurrent requests (both Gets and Sets). I have found https://github.com/pmylund/go-cache claiming to be thread safe. This…
ANisus
  • 74,460
  • 29
  • 162
  • 158
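The question targets Go, but the basic recipe is language-agnostic: keep the map and the recency list behind one mutex so that every Get and Set, including any eviction it triggers, is atomic. A sketch, written in Python only to stay consistent with the other examples on this page (in Go the analogous shape would be a struct holding a sync.Mutex, a map, and a container/list):

    import threading
    from collections import OrderedDict

    class ThreadSafeLRU:
        """LRU cache whose get/set are guarded by a single lock."""

        def __init__(self, capacity):
            self.capacity = capacity
            self._items = OrderedDict()
            self._lock = threading.Lock()

        def get(self, key, default=None):
            with self._lock:
                if key not in self._items:
                    return default
                self._items.move_to_end(key)   # refresh recency while holding the lock
                return self._items[key]

        def set(self, key, value):
            with self._lock:
                self._items[key] = value
                self._items.move_to_end(key)
                if len(self._items) > self.capacity:
                    self._items.popitem(last=False)   # evict under the same lock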
3
votes
1 answer

Pythonic approach to keeping track of cached variable/function dependencies

I have a system with a library which includes many functions/methods that are slow, for example SQL queries or computational expensive algorithms. Therefore, I have identified those that can benefit from caching and use the lru_cache or cache…
dogAwakeCat
  • 311
  • 2
  • 15
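One low-tech pattern for this: functools.lru_cache exposes cache_clear(), so whatever code mutates a dependency can clear the caches of the functions derived from it. The dependency registry below is my own illustration, not a standard-library feature:

    from functools import lru_cache

    # Map each piece of mutable state to the cached functions that depend on it.
    _dependents = {"tax_rate": []}
    _state = {"tax_rate": 0.2}

    def depends_on(name):
        def register(func):
            _dependents[name].append(func)
            return func
        return register

    def set_state(name, value):
        _state[name] = value
        for func in _dependents[name]:
            func.cache_clear()          # invalidate everything derived from it

    @depends_on("tax_rate")             # registers the already-memoized function
    @lru_cache(maxsize=None)
    def gross_price(net):
        return net * (1 + _state["tax_rate"])

    print(gross_price(100))   # 120.0, computed
    set_state("tax_rate", 0.25)
    print(gross_price(100))   # 125.0, recomputed because the cache was cleared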
3
votes
6 answers

LRU vs FIFO vs Random

When there is a page fault or a cache miss we can use either the Least Recently Used (LRU), First In First Out (FIFO) or Random replacement algorithms. I was wondering, which one provides the best performance, aka the least possible future cache…
rrazd
  • 1,741
  • 2
  • 32
  • 47
3
votes
1 answer

How can I implement an expiring LRU cache in elisp?

I have data that consists of the following three components: a_path, a_key, and a_value = f(a_path, a_key). a_value is expensive to calculate, so I want to calculate it infrequently. In an ideal world, that would be only when it's going to change. So, my…
Chris R
  • 17,546
  • 23
  • 105
  • 172
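The question asks for elisp; as a language-agnostic illustration (kept in Python like the other sketches on this page), an expiring LRU cache just stores a timestamp alongside each value and treats entries older than max_age as misses:

    import time
    from collections import OrderedDict

    class ExpiringLRU:
        """LRU cache whose entries also expire after max_age seconds."""

        def __init__(self, capacity, max_age):
            self.capacity = capacity
            self.max_age = max_age
            self._items = OrderedDict()   # key -> (value, stored_at)

        def get(self, key, default=None):
            entry = self._items.get(key)
            if entry is None:
                return default
            value, stored_at = entry
            if time.monotonic() - stored_at > self.max_age:
                del self._items[key]      # expired: drop it and report a miss
                return default
            self._items.move_to_end(key)  # still fresh: refresh recency
            return value

        def put(self, key, value):
            self._items[key] = (value, time.monotonic())
            self._items.move_to_end(key)
            if len(self._items) > self.capacity:
                self._items.popitem(last=False)

The question's expensive f(a_path, a_key) would then be computed only on a miss and stored with put().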
3
votes
1 answer

C++ Best container for really simple LRU cache

I need to implement a really simple LRU cache which stores memory addresses. The count of these addresses is fixed (at runtime). I'm only interested in the least-recently used address (I don't care about the order of the other elements). Each…
0xbadf00d
  • 17,405
  • 15
  • 67
  • 107
3
votes
1 answer

How to combine @singledispatch and @lru_cache?

I have a Python single-dispatch generic function like this: @singledispatch def cluster(documents, n_clusters=8, min_docs=None, depth=2): ... It is overloaded like this: @cluster.register(QuerySet) @lru_cache(maxsize=512) def _(documents, *args,…
Carsten
  • 1,912
  • 1
  • 28
  • 55
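A self-contained version of the stacking shown in the excerpt (tuple stands in for Django's QuerySet so it runs anywhere) illustrates the order that works: lru_cache wraps the implementation first, and the memoized function is what gets registered. Keep in mind that lru_cache requires every argument to be hashable:

    from functools import lru_cache, singledispatch

    @singledispatch
    def cluster(documents, n_clusters=8):
        """Fallback implementation for unregistered types."""
        raise TypeError(f"unsupported type: {type(documents)!r}")

    @cluster.register(tuple)            # stand-in for QuerySet in the original
    @lru_cache(maxsize=512)             # applied first, so the cached version is registered
    def _(documents, n_clusters=8):
        print("clustering", len(documents), "documents")
        return [documents[i::n_clusters] for i in range(n_clusters)]

    docs = ("a", "b", "c", "d")
    cluster(docs, 2)    # computes (prints)
    cluster(docs, 2)    # answered from the lru_cache, nothing printed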