Expanding lru_cache to allow caching data without a function call

Is this because dict is a Python data structure while the cached data is kept in a different C one (I don't know which one, by the way; I'm curious about this)?
Just an idea, maybe not a good one (I am not proficient in C)… but what about merging two caches together (updating one cache from another) in the C layer? Then I suppose a wrapper trick a bit like @blhsing's one could be used to provide a proper API in the Python layer, along the lines of the sketch below.
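
Purely as an API sketch (nothing like this exists today, and the method name is invented):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def f(x):
    return x * 2

precomputed = {(1,): 2, (2,): 4}  # plain dict of args-tuple -> result

# Imagined surface: one C-level call merging the dict into f's cache.
# f.cache_update(precomputed)     # hypothetical method, does not exist
```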

You suggested simply passing in {key_1: value_1, ...}. This is reasonable enough for your own code, for a function you know about that is only called with hashable args.

But to work with lru_cache in general, firstly the target function f could be called with f(list_1) (a list cannot be a dict key), and secondly, internally, the pure-Python implementation of lru_cache uses a cache structure like {make_key(key_1): [prev, next, make_key(key_1), value_1], ...}.
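
To make that concrete, here is roughly what inserting a precomputed entry would involve, simplified from CPython's pure-Python functools.py (_make_key is a private helper, so this is illustrative only, not a supported API):

```python
from functools import _make_key  # CPython private helper; illustrative only

PREV, NEXT, KEY, RESULT = 0, 1, 2, 3  # link field layout, as in functools.py

cache = {}
root = []                           # sentinel of the circular doubly linked list
root[:] = [root, root, None, None]

def cache_put_sketch(args, kwds, value):
    # Roughly what injecting a precomputed entry would require:
    key = _make_key(args, kwds, typed=False)
    last = root[PREV]
    link = [last, root, key, value]  # [prev, next, key, result]
    last[NEXT] = root[PREV] = cache[key] = link
```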

With some refactoring, the simple dict could be passed in and transformed into one that lru_cache can use. But if that is done, cache_put might as well be exposed in the API and users can just fill their boots.
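
A userland sketch of what an exposed cache_put might look like (everything here is invented, not a proposed stdlib API; it reuses the private functools._make_key for key compatibility, and seeded entries live in a side dict that is never evicted):

```python
import functools

def lru_cache_with_put(maxsize=128):
    # Hypothetical decorator: lru_cache plus a cache_put hook.
    def decorator(func):
        cached = functools.lru_cache(maxsize=maxsize)(func)
        seeded = {}  # entries injected without ever calling func

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = functools._make_key(args, kwargs, typed=False)
            if key in seeded:
                return seeded[key]
            return cached(*args, **kwargs)

        def cache_put(value, *args, **kwargs):
            # value comes first, so it cannot collide with func's keywords
            seeded[functools._make_key(args, kwargs, typed=False)] = value

        wrapper.cache_put = cache_put
        wrapper.cache_info = cached.cache_info
        return wrapper
    return decorator
```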


lru_cache requires that the wrapped function’s args and kwargs are all hashable. From the lru_cache docs:

Since a dictionary is used to cache results, the positional and keyword arguments to the function must be hashable.
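
For example:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def total(values):
    return sum(values)

total((1, 2, 3))   # fine: tuples are hashable
total([1, 2, 3])   # TypeError: unhashable type: 'list'
```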

That said, this seems like something that should obviously go in a package on PyPI. If it becomes wildly popular, then maybe someone can argue that it should be moved into the standard library, but that seems unlikely.

Hmm, I made the return-value setter a separate function just in case the wrapped function itself has a keyword argument named return_value that would conflict with cache_put’s return_value parameter. For friendlier usage I also want to stick to unpacked args and kwargs instead of passing args and kwargs as a tuple and a dict. But yeah, for easier illustration I’ve modified my code above to match the signature that the OP proposed in the first paragraph (rather than in the OP’s sample code).
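
To spell out the collision the separate setter avoids (all names here are invented for illustration, not the actual code above):

```python
from functools import _make_key  # CPython private helper

_store = {}  # stand-in for the real cache

def cache_put(func, *args, **kwargs):
    # Two-step setter style: the result is supplied in a second call,
    # so no keyword in **kwargs can collide with a reserved name.
    key = (func, _make_key(args, kwargs, typed=False))
    def set_result(value):
        _store[key] = value
    return set_result

def render(template, return_value=False):  # legitimately uses the name
    return f"rendered {template!r} (return_value={return_value})"

# return_value=True unambiguously belongs to render() here:
cache_put(render, "home.html", return_value=True)("<html>...</html>")
```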


Thanks - I didn’t realise that.

Oh, of course, it’s being attached to another function / instance. I didn’t realise setters could be created like that, without a special decorator or dunder method.
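
The trick is just that functions are ordinary objects, so any callable can be attached as an attribute and act as a “setter”; a toy example (names invented):

```python
def greet(name):
    return f"hello {name}"

def _set_prefix(prefix):        # plain function used as a "setter"
    greet.prefix = prefix       # functions are objects: attach freely

greet.set_prefix = _set_prefix  # no @property or __set__ required
greet.set_prefix("Dr.")
print(greet.prefix)             # Dr.
```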


Ah, I see now. For some reason (perhaps it was late at night here) I misread your example as get_data_on_all_users instead of get_all_data_on_user. Your use case is more interesting; I’ve encountered it in my own practice. Usually, implementing the caching manually via a dict lookup works pretty well. lru_cache would not make a significant difference, and it cannot always be used (for methods, for example). I am sure there are use cases where lru_cache could be applied and where it would make a difference, but I’m not sure there are enough of them. You would need to convince @rhettinger, who is not present here (you can contact him via Twitter or GitHub).
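
For reference, the manual dict-lookup pattern, which also makes pre-seeding trivial (fetch_user is a hypothetical stand-in for the real data source):

```python
def fetch_user(user_id):
    # hypothetical stand-in for a real database/API call
    return {"id": user_id, "name": f"user{user_id}"}

_user_cache = {}

def get_all_data_on_user(user_id):
    try:
        return _user_cache[user_id]       # manual dict lookup
    except KeyError:
        data = _user_cache[user_id] = fetch_user(user_id)
        return data

# Pre-seeding needs no per-user function call:
_user_cache.update({1: {"id": 1, "name": "alice"}})
```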