Add an option to call the function before accessing the cache with lru_cache

As we know, functools.lru_cache accepts two parameters: maxsize and typed.
In some cases, the function being cached wraps a call to an external API. For example, I recently wrote a function that fetches weather data from an external API. I call this function many times on my server because it is part of training machine learning models, but lately the API has been somewhat unstable. I wouldn’t want to rely entirely on the cache, since the external application might update the weather data and I would miss information. But that doesn’t happen very often, so rather than letting my experiment fail, I would like to have the cache as a backup.
The idea is simple: call the function normally, and if an exception occurs, fall back to the result stored in the cache.
I’m currently working on this change, but I’d like to hear your thoughts on it. Thank you so much.


Fundamentally, the purpose of the cache is specifically to avoid the call. It’s designed as a performance optimization, not a fallback mechanism.

If you want to do this for some reason, it’s easy enough in user code. Supposing we have

import functools

@functools.lru_cache() # or `@functools.cache` in 3.9 onward
def example():
    ... # e.g. a call to an external API

Then we can get the result you want by:

    try:
        # Call the original, undecorated function, bypassing the cache.
        result = example.__wrapped__()
    except Exception:
        # Fall back to the cached result.
        result = example()

Edit: Keep in mind that when a call to example.__wrapped__() (i.e., the original function that was passed to the decorator) succeeds, that will not store a value in the cache. If you want to store such values to possibly use later, and want to use a cache specifically of fallback values, then you will probably be better off making your own decorator from scratch for that purpose. Although if you need to be able to handle arbitrary arguments, it won’t be easy. You can’t just accept (*args, **kwargs) in the wrapper function and use (args, kwargs) as a key, because it won’t be hashable.
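To illustrate, here is a minimal sketch of such a from-scratch decorator (the name fallback_cache is hypothetical; it assumes all arguments are hashable, and it catches every exception for brevity, which you would want to narrow to your API's error types in real code):

```python
import functools

def fallback_cache(func):
    """Call func normally and remember successful results; on failure,
    return the last successful result for the same arguments."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Build a hashable key; a plain dict of kwargs is not hashable,
        # so we use a sorted tuple of its items instead.
        key = (args, tuple(sorted(kwargs.items())))
        try:
            result = func(*args, **kwargs)
        except Exception:
            if key in cache:
                return cache[key]  # fall back to the stored value
            raise                  # no fallback available, re-raise
        cache[key] = result        # remember the fresh result
        return result

    return wrapper
```

Unlike lru_cache, this always calls the wrapped function first and only consults its cache when that call raises, which matches the behavior described in the original post.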