Could we make lru_cache entries time-bound (TTL) by extending it with an expiry in seconds or milliseconds?
I see, thanks, is there any plan to push it in the standard lib? It is a very standard usage.
I don’t think we want to add it to the stdlib. The package is non-trivial. It’s a maintenance burden that we don’t want to add to the stdlib, as it already exists as a 3rd-party library. We want to keep the interface of lru_cache as simple as possible, without additional features that are not related to an LRU cache per se (what you want is essentially a TTL cache, not an LRU one; the eviction is semantically a bit different).
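The semantic difference can be sketched with two toy mappings; the `LRU` and `TTL` classes below are illustrative, not stdlib APIs. An LRU cache evicts by *size* when a new entry is inserted, regardless of age; a TTL cache evicts by *age* when an entry is read, regardless of use:

```python
import time
from collections import OrderedDict

class LRU:
    """Evicts the least recently used entry once maxsize is exceeded."""
    def __init__(self, maxsize):
        self.maxsize, self.data = maxsize, OrderedDict()
    def put(self, key, value):
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.maxsize:
            self.data.popitem(last=False)   # evict the least recently used, however fresh
    def get(self, key):
        self.data.move_to_end(key)          # a read refreshes the entry's position
        return self.data[key]

class TTL:
    """Evicts an entry once it is older than ttl seconds, however often it is used."""
    def __init__(self, ttl):
        self.ttl, self.data = ttl, {}
    def put(self, key, value):
        self.data[key] = (value, time.monotonic())
    def get(self, key):
        value, born = self.data[key]
        if time.monotonic() - born > self.ttl:
            del self.data[key]              # expired: age decides, not usage
            raise KeyError(key)
        return value
```

Note that in the LRU case a heavily used entry can live forever, while in the TTL case even the hottest entry dies on schedule; that is why TTL expiry is a different feature rather than an extension of LRU eviction.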
Ok, thank you, np.
The only reason I asked is that 3rd-party libraries are often not allowed in big corporations, while the standard library is. Also, the space and time travel together in the compute.
Also, like you said, the implementation is trivial if we consider an add-on like the following:
import time
from functools import lru_cache, wraps

def ttls_lru_cache(maxsize=128, ttl=60, typed=False):
    """
    LRU + TTL (in seconds) cache decorator.

    Uses functools.lru_cache internally (for size) and adds time-based expiry.
    Compatible across Python versions.
    """
    def decorator(func):
        cached_func = lru_cache(maxsize=maxsize, typed=typed)(func)
        timestamps = {}

        @wraps(func)
        def wrapper(*args, **kwargs):
            key = args
            if kwargs:
                key += tuple(sorted(kwargs.items()))
            now = time.time()
            # Expiry check: lru_cache cannot evict a single entry,
            # so an expired key invalidates the whole cache.
            if key in timestamps and now - timestamps[key] > ttl:
                cached_func.cache_clear()
                timestamps.clear()
            value = cached_func(*args, **kwargs)
            timestamps[key] = now
            return value

        # Expose the available lru_cache utilities
        wrapper.cache_clear = cached_func.cache_clear
        wrapper.cache_info = cached_func.cache_info
        if hasattr(cached_func, "cache_parameters"):
            wrapper.cache_parameters = cached_func.cache_parameters
        return wrapper
    return decorator
Navnidhi
Also, the space and time travel together in the compute.
I don’t quite understand this, but if you are talking about whether the stdlib implementation would be faster, I don’t think that will necessarily be the case, as it’s written in pure Python.
The only reason I asked is that 3rd-party libraries are often not allowed in big corporations, while the standard library is.
In this case, such corporations should consider making their own implementation that fits their needs. This wouldn’t be the sole occurrence of such an issue, and the stdlib shouldn’t be responsible for it; otherwise every popular package on PyPI would have to be included in the stdlib, which is something we don’t do.
Also, like you said, the implementation is trivial if we consider an add-on like the following:
Since it’s trivial for your use cases, you should add it to the projects themselves rather than to the stdlib.
I added it to my profile just yesterday. Thank you.
But for sake of an argument:
I meant to say that without a time reference, the stdlib lru_cache is incomplete. We care about time and space together, i.e. Big O, a basic of computer science.
A cache entry could be stale after a time T, and a cached value becomes wrong and useless, as below, if my retention is 2t. When 2t = 0, the behavior is compatible with the current stdlib version.
Example:
- Cache at T-2t with entry A
- Cache at T-t with two entries A and B.
- Cache at T with two entries A and B, where A is stale and we want to recalculate it; that is an important use case.
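The timeline above can be sketched with a plain dict of (value, timestamp) pairs; `put` and `is_stale` are illustrative helpers, and t is shrunk to 0.2 seconds so the example runs quickly:

```python
import time

t = 0.2                  # the retention unit, shrunk for the demo
cache = {}               # key -> (value, inserted_at)

def put(key, value):
    cache[key] = (value, time.monotonic())

def is_stale(key, retention):
    _, born = cache[key]
    return time.monotonic() - born > retention

put("A", 1)              # cache at T-2t: entry A
time.sleep(t)
put("B", 2)              # cache at T-t: entries A and B
time.sleep(t)
# At T, with retention 2t: A has aged past 2t and should be recomputed,
# while B (age t) is still fresh.
a_stale = is_stale("A", 2 * t)
b_stale = is_stale("B", 2 * t)
```

With retention 2t, A should be recomputed at T while B is still served from the cache; a pure LRU cache would happily return the stale A as long as it fits in maxsize.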
Or, in 10 years, people will have three ways to print variables rather than one solid implementation.
And it is not just about Python; in C++ as well, with auto_ptr and double-checked locking, a few don’t get it until a decade has passed. :)
Though you can proudly and innocently blame those people for being wrong while giving them an incomplete and immature API.