Questions tagged [lru]

LRU is a family of caching algorithms, where LRU stands for least recently used.

As the name implies, an LRU cache discards the least recently used items when its contents exceed a certain threshold. In this algorithm, writes are relatively expensive, since the usage information must be updated every time an item is accessed.

More information at Wikipedia.
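A minimal illustration of the idea in Python, using `collections.OrderedDict` to track recency (class and method names here are illustrative, not from any particular question):

```python
from collections import OrderedDict

class LRUCache:
    """Discards the least recently used entry once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key, default=None):
        if key not in self._data:
            return default
        # Reading a key counts as "use": move it to the most-recent end.
        self._data.move_to_end(key)
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            # popitem(last=False) removes the least recently used entry.
            self._data.popitem(last=False)
```

Note the cost mentioned above: every `get` also writes (the `move_to_end` bookkeeping), which is what makes LRU bookkeeping expensive relative to a plain dictionary.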

304 questions
5
votes
0 answers

Python LRU cache's global and per instance behavioral differences

I'm going through the implementation details of Python's LRU cache decorator. To understand the behavior of the lru_cache decorator in different scenarios properly, I've also gone through the following SO answers: Python LRU Cache Decorator Per…
Redowan Delowar
  • 1,580
  • 1
  • 14
  • 36
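For context on the global-vs-per-instance distinction this question concerns: decorating a method at class level gives one cache shared by all instances (with `self` as part of the key, which also keeps instances alive), while wrapping in `__init__` gives each instance its own cache. A small sketch under those assumptions:

```python
from functools import lru_cache

class Shared:
    # One cache shared by every instance; `self` is part of the key,
    # so the cache holds strong references to the instances.
    @lru_cache(maxsize=None)
    def double(self, n):
        return n * 2

class PerInstance:
    def __init__(self):
        # Wrapping inside __init__ gives each instance its own cache.
        self.double = lru_cache(maxsize=None)(lambda n: n * 2)
```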
5
votes
0 answers

How to lru_cache a function when arguments are object instances and the cache needs to look into instance attributes?

I am trying to implement the lru_cache on a function that takes in a Python object as an argument. The function should return a cached value only if its argument's attributes have not changed. However, it looks like the lru_cache only does a…
Anubhav
  • 545
  • 3
  • 14
5
votes
4 answers

limit the size of a std::set

I have a short question about the std::set container. Right now I am feeding my set using the push_back function. Of course the set becomes larger and larger for every push_back. I am only interested in the latest 30 elements or so... The older…
Lumpi
  • 2,697
  • 5
  • 38
  • 47
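The question is about C++, but the "keep only the latest N elements" idea is worth illustrating; in Python the analogous tool is `collections.deque` with `maxlen`, which silently drops the oldest element once the bound is reached:

```python
from collections import deque

# A deque with maxlen keeps only the most recent items:
# appending past the limit discards the oldest element.
latest = deque(maxlen=30)
for i in range(100):
    latest.append(i)
```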
5
votes
3 answers

LRU Cache Implementation in Java

I have seen the following code, and I think there is a useless while loop in the implementation of the addElement method. It should never hold more than size+1 elements, since there is already a write lock. So why is the addElement method…
Sandeep Kaul
  • 2,957
  • 2
  • 20
  • 36
5
votes
1 answer

Implementing LRU with timestamp: How expensive is memory store and load?

I'm talking about the LRU memory page replacement algorithm implemented in C, NOT in Java or C++. According to the OS course notes: OK, so how do we actually implement an LRU? Idea 1): mark everything we touch with a timestamp. Whenever we need to…
Junji Zhi
  • 1,382
  • 1
  • 14
  • 22
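The timestamp idea from those course notes can be sketched as follows (in Python rather than C, with a logical clock standing in for hardware timestamps; names are illustrative). Eviction is a linear scan for the smallest stamp, which is exactly the cost the notes discuss:

```python
from itertools import count

class TimestampLRU:
    """LRU via a logical clock: each access stamps the entry; eviction
    scans for the smallest stamp (an O(n) scan per eviction)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.stamp = {}
        self.clock = count()

    def get(self, key, default=None):
        if key in self.data:
            self.stamp[key] = next(self.clock)  # "touch" on every read
            return self.data[key]
        return default

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            # Linear scan: evict the key with the oldest timestamp.
            victim = min(self.stamp, key=self.stamp.get)
            del self.data[victim]
            del self.stamp[victim]
        self.data[key] = value
        self.stamp[key] = next(self.clock)
```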
5
votes
3 answers

Android LRUCache Retrieval

I have implemented a standard LRUCache in Android that stores Objects. Each key is a unique ObjectId associated with the Object stored. My problem is that the only way to retrieve an Object from cache is by the ObjectId (no iterator). What would be…
Nelson.b.austin
  • 3,080
  • 6
  • 37
  • 63
4
votes
2 answers

Redis Internals - LRU Implementation For Sampling

Does anyone know about the internals of Redis' LRU-based eviction/deletion? How does Redis ensure that the older (less used) keys are deleted first (in case we do not have volatile keys and we are not setting TTL expiration)? I know for sure…
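For background: Redis implements an *approximated* LRU — rather than tracking exact recency order for every key, it samples a handful of keys and evicts the one with the oldest access time. The sampling idea can be sketched like this (illustrative names, not Redis code):

```python
import random

def evict_approx_lru(stamps, sample_size=5):
    """Redis-style approximated LRU: sample a few keys and evict the one
    with the oldest access stamp, instead of scanning every key."""
    sample = random.sample(list(stamps), min(sample_size, len(stamps)))
    return min(sample, key=stamps.get)
```

Larger sample sizes approximate true LRU more closely at higher per-eviction cost; that trade-off is the point of the sampling design.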
4
votes
1 answer

Python lru_cache: how can currsize < misses < maxsize?

I have a class with a method that is annotated with the lru_cache annotation: CACHE_SIZE=16384 class MyClass: [...] @lru_cache(maxsize=CACHE_SIZE) def _my_method(self, texts: Tuple[str]): def…
Carsten
  • 1,912
  • 1
  • 28
  • 55
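One deterministic way `currsize` can trail `misses` (another is concurrent threads missing on the same key before the first result is stored): when the cached function raises, the miss is counted but no entry is stored. A small sketch, with illustrative names:

```python
from functools import lru_cache

@lru_cache(maxsize=16)
def may_fail(n):
    if n < 0:
        raise ValueError("negative")
    return n * 2
```

Every failing call counts as a fresh miss, so `misses` grows while `currsize` stays put.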
4
votes
1 answer

How to combine dataclass, property, and lru_cache

I'm trying to combine dataclasses, properties and lru_caches for some computational science code: from dataclasses import dataclass from typing import Any from functools import lru_cache @dataclass class F: a: Any = 1 b: Any = 2 c: Any =…
Bananach
  • 2,016
  • 26
  • 51
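One commonly suggested combination for this shape of problem is `functools.cached_property` (Python 3.8+), which caches per instance rather than in a shared global table. A minimal sketch, assuming a mutable (non-frozen, non-slots) dataclass; note the caveat that the cached value is *not* invalidated when a field later changes:

```python
from dataclasses import dataclass
from functools import cached_property

@dataclass
class F:
    a: float = 1.0
    b: float = 2.0

    @cached_property
    def total(self):
        # Computed once per instance, then stored in the instance dict.
        return self.a + self.b
```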
4
votes
3 answers

pickling lru_cached function on object

As part of parallelizing some existing code (with multiprocessing), I run into the situation that something similar to the class below needs to be pickled. Starting from: import pickle from functools import lru_cache class Test: def…
Bram
  • 618
  • 4
  • 14
4
votes
1 answer

LRU Cache in Node js

I need to implement caching for my project (for my organisation). We are planning to have an in-memory LRU cache. I have found some packages, but I am not sure about the licensing terms; the best I found was…
user3649361
  • 944
  • 4
  • 20
  • 40
4
votes
1 answer

LRU cache on top of shared preferences

I would like to create an LRU cache that is backed by shared preferences. Basically I would like to store a specific number of strings (around 20) and have LRU behavior. I know that LinkedHashMap is an LRU in Java, but is there a way to…
Jim
  • 18,826
  • 34
  • 135
  • 254
4
votes
2 answers

In which case LFU is better than LRU?

I have been trying to find a good case in which LFU is better than LRU, but I am not sure of one. What I have managed to do (though not sure if it is a good example) is the case when you have a cache with capacity 3 and the cache requests are 4 (like…
Argiris
  • 189
  • 2
  • 15
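One illustrative case where LFU wins (a sketch, not from the question): a hot key re-requested frequently, interleaved with a scan wider than the cache. LRU keeps evicting the hot key on every scan pass, while LFU protects it because of its high access count. A small replay simulator makes the difference measurable:

```python
from collections import Counter, OrderedDict

def count_misses(policy, requests, capacity):
    """Replay a request sequence under 'lru' or 'lfu' eviction."""
    cache = OrderedDict()   # recency order, used by LRU
    freq = Counter()        # lifetime access counts, used by LFU
    misses = 0
    for key in requests:
        freq[key] += 1
        if key in cache:
            cache.move_to_end(key)          # refresh recency on a hit
            continue
        misses += 1
        if len(cache) >= capacity:
            if policy == "lru":
                cache.popitem(last=False)   # drop least recently used
            else:
                # Drop the resident key with the fewest lifetime accesses.
                victim = min(cache, key=lambda k: freq[k])
                del cache[victim]
        cache[key] = True
    return misses
```

With capacity 2 and the pattern `A A A B C A B C A`, LRU misses 7 times (it evicts A on every B/C scan), while LFU misses only 5 times (A's access count keeps it resident).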
4
votes
2 answers

Erlang LRU Cache

How to implement an LRU cache in Erlang? LRU Cache Wiki. The top-starred GitHub project was fogfish/cache, but its segmented table was not quite fit for my data; barrel-db/erlang-lru was using a List. After testing, it would be slow if there were too much…
user3644708
  • 2,466
  • 2
  • 16
  • 32
4
votes
2 answers

LinkedHashMap LRU Cache - Determine what values are removed?

Background Information You can make a LRU cache with a LinkedHashMap as shown at this link. Basically, you just: Extend linked hash map. Provide a capacity parameter. Initialize the super class (LinkedHashMap) with parameters to tell it its…
John Humphreys
  • 37,047
  • 37
  • 155
  • 255