
Show HN: Supycache – Simple yet capable caching decorator for Python - devnonymous
https://github.com/lonetwin/supycache
======
RMarcus
This is neat. Kind of reminds me of the built-in LRU cache decorator[1] (as
the author points out). I'd be interested to know what kind of policies there
were to control the cache size (will it just expand to fill up Redis /
memcached?), as well as how cache TTL is handled.

Being able to determine which parameters "matter" is an interesting feature.
It doesn't immediately seem useful (if the value of the parameter won't change
the return value of a function, why does it matter?). It'd be interesting to
see some kind of dynamic support for the keys. For example, having a cached
value for if parameter `a` is larger than parameter `b`, or for if the first
argument of a function is "None".

Great work, and thanks for making it open source! Further props for making it
Python 3.

[1]
[https://docs.python.org/3.3/library/functools.html#functools...](https://docs.python.org/3.3/library/functools.html#functools.lru_cache)

~~~
devnonymous
Thanks for the comment @RMarcus.

    
    
      > I'd be interested to know what kind of policies there were to control the
      > cache size (will it just expand to fill up Redis / memcached?), as well as
      > how cache TTL is handled.
    

Currently, there are no policies. Yes, the cache size will expand to hold as
much data as you put into it (without expiring). The nice thing about the way
I wrote this is that any options passed to the decorator that are not handled
by supycache itself are handed over to the `backend` (i.e. your memcached or
redis connection object), so that is one way to pass configurables on to the
backend.
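The pass-through pattern described above could be sketched roughly like this. Note that `make_cache` and `DictBackend` are hypothetical stand-ins to illustrate the idea, not supycache's actual API, and the in-memory dict stands in for a real memcached/redis connection:

```python
import functools

class DictBackend:
    """Stand-in for a memcached/redis connection object."""
    def __init__(self, **options):
        self.options = options   # e.g. ttl -- the backend's job to honour
        self.store = {}

    def get(self, key):
        return self.store.get(key)

    def set(self, key, value):
        self.store[key] = value

def make_cache(cache_key, **options):
    # Options the decorator itself doesn't understand are handed,
    # untouched, to the backend.
    backend = DictBackend(**options)
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = cache_key.format(*args, **kwargs)
            cached = backend.get(key)
            if cached is None:
                cached = func(*args, **kwargs)
                backend.set(key, cached)
            return cached
        return wrapper
    return decorator

calls = []

@make_cache(cache_key='{0}', ttl=300)   # `ttl` is simply passed through
def slow_double(n):
    calls.append(n)                     # records actual (uncached) calls
    return n * 2
```

Repeated calls with the same argument hit the backend instead of re-running the function, and `ttl=300` lands in `backend.options` for the backend to interpret.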

    
    
      > Being able to determine which parameters "matter" is an interesting feature.
      > It doesn't immediately seem useful (if the value of the parameter won't
      > change the return value of a function, why does it matter?).
    

So, the thing is that it is not the 'parameter' that 'matters', it is the
_value_ of the parameter that makes this interesting. Actually, this use case
came out of necessity: I needed an `lru_cache`-style cache for a method that
operated on several parameters, but instead of caching the return value under
a key created by hashing all the parameters, I needed to create the key from
just one parameter (actually, one attribute of one parameter, kinda similar
to the `user` object example in the readme).

    
    
      > It'd be interesting to see some kind of dynamic support for the keys. For
      > example, having a cached value for if parameter `a` is larger than parameter
      > `b`, or for if the first argument of a function is "None".
    

Sorry, the docs haven't been clear about this, but the `cache_key` can _also_
be a callable, which is only called at runtime with the args/kwargs passed to
the decorated function. That'll allow you to define your own cache-key-
creation rule! I'll update the docs to mention this.
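A callable key would cover exactly the dynamic rules suggested above (e.g. one cache slot when `a` is larger than `b`, another otherwise). A minimal sketch of the idea, with a hypothetical `cached` decorator standing in for supycache:

```python
import functools

def cached(cache_key):
    store = {}
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # A callable key is invoked at runtime with the decorated
            # function's arguments; a string key is format()-ed instead.
            key = (cache_key(*args, **kwargs)
                   if callable(cache_key)
                   else cache_key.format(*args, **kwargs))
            if key not in store:
                store[key] = func(*args, **kwargs)
            return store[key]
        return wrapper
    return decorator

# Dynamic rule: one cache slot for a > b, another for a <= b.
@cached(cache_key=lambda a, b: 'a_gt_b' if a > b else 'a_le_b')
def compare(a, b):
    return (a, b)
```

Every call with `a > b` maps to the same key, so the first such result is reused for all of them.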

Thanks for your comments! Glad you like it.

