Using @functools.lru_cache with dictionary arguments
I have a method that takes (among others) a dictionary as an argument. The method is parsing strings and the dictionary provides replacements for some substrings, so it doesn't have to be mutable.
This function is called quite often, and on redundant elements so I figured that caching it would improve its efficiency.
But, as you may have guessed, since dict is mutable and thus not hashable, @functools.lru_cache can't decorate my function. So how can I overcome this?
Bonus points if it needs only standard library classes and methods. Ideally, if some kind of frozendict that I haven't seen exists in the standard library, it would make my day.
PS: namedtuple only as a last resort, since it would require a big syntax shift.
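A minimal repro of the problem being described (the function and the replacement dict here are hypothetical, just to show the failure):

    import functools

    @functools.lru_cache(maxsize=None)
    def expand(text, replacements):
        # replacements is a plain dict mapping substrings to their substitutes
        for old, new in replacements.items():
            text = text.replace(old, new)
        return text

    # Fails: dicts are mutable, hence unhashable, hence not cacheable
    expand("Hello $name", {"$name": "world"})   # TypeError: unhashable type: 'dict'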
Comments (9)
Instead of using a custom hashable dictionary, use this and avoid reinventing the wheel! It's a frozen dictionary that's all hashable.
https://pypi.org/project/frozendict/
Code:
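A minimal sketch of the decorator being referred to, assuming it wraps every dict argument in a frozendict before the cached function sees it (the name freezeargs is an assumption, not necessarily the original code):

    import functools
    from frozendict import frozendict

    def freezeargs(func):
        """Convert any dict arguments into hashable frozendict instances."""
        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            args = tuple(frozendict(a) if isinstance(a, dict) else a for a in args)
            kwargs = {k: frozendict(v) if isinstance(v, dict) else v
                      for k, v in kwargs.items()}
            return func(*args, **kwargs)
        return wrapped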
and then
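Presumably the usage stacks the wrapper above lru_cache, something like this (the function name parse is a placeholder):

    from functools import lru_cache

    @freezeargs
    @lru_cache(maxsize=None)
    def parse(text, substitutions):
        # substitutions arrives here as a hashable frozendict
        ...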
Code taken from @fast_cen's answer.
Note: this does not work on recursive data structures; for example, you might have an argument that is a list, which is unhashable. You are invited to make the wrapping recursive, so that it goes deep into the data structure and makes every dict frozen and every list a tuple. (I know the OP no longer wants a solution, but I came here looking for the same one, so I'm leaving this for future generations.)
Here is a decorator that uses @mhyfritz's trick. Simply add it before your lru_cache.
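Presumably the trick is to wrap dict arguments in a dict subclass that defines __hash__; a sketch under that assumption (hash_dict and HDict are made-up names):

    import functools

    def hash_dict(func):
        """Wrap dict arguments in a hashable dict subclass so lru_cache can hash them."""
        class HDict(dict):
            def __hash__(self):
                return hash(frozenset(self.items()))

        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            args = tuple(HDict(a) if isinstance(a, dict) else a for a in args)
            kwargs = {k: HDict(v) if isinstance(v, dict) else v
                      for k, v in kwargs.items()}
            return func(*args, **kwargs)
        return wrapped

    @hash_dict
    @functools.lru_cache(maxsize=None)
    def parse(text, substitutions):
        ...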
What about creating a hashable dict class, like so:
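Perhaps something like this (HDict is a made-up name; it hashes the frozenset of its items, so the values themselves must be hashable):

    class HDict(dict):
        """A dict that can be hashed, as long as its keys and values are hashable."""
        def __hash__(self):
            return hash(frozenset(self.items()))

    substitutions = HDict({"substr1": "repl1", "substr2": "repl2"})
    cache = {substitutions: True}   # works: HDict instances can be used as dict keys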
Based on @Cedar's answer, adding recursion for a deep freeze as suggested:
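A sketch of that recursive variant, assuming the same freezeargs idea as above but descending into nested containers, so nested dicts become frozendicts and lists become tuples (the helper name _freeze is an assumption):

    import functools
    from frozendict import frozendict

    def _freeze(obj):
        """Recursively turn dicts into frozendicts and lists into tuples."""
        if isinstance(obj, dict):
            return frozendict({k: _freeze(v) for k, v in obj.items()})
        if isinstance(obj, list):
            return tuple(_freeze(item) for item in obj)
        return obj

    def freezeargs(func):
        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            args = tuple(_freeze(a) for a in args)
            kwargs = {k: _freeze(v) for k, v in kwargs.items()}
            return func(*args, **kwargs)
        return wrapped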
How about subclassing namedtuple and adding access by x["key"]?
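A rough sketch of that idea (the class name and fields are made up); the instance is hashable, so it works with lru_cache, but adopting it would indeed require changing the call sites:

    from collections import namedtuple

    class Subs(namedtuple("Subs", ["foo", "bar"])):
        """Immutable, hashable record with dict-style x["key"] access."""
        def __getitem__(self, key):
            if isinstance(key, str):
                return getattr(self, key)
            return super().__getitem__(key)   # keep positional indexing

    s = Subs(foo="spam", bar="eggs")
    print(s["foo"], s[1])   # -> spam eggs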
Here's a decorator that can be used like functools.lru_cache. But it is targeted at functions that take only one argument, which is a flat mapping with hashable values, and it has a fixed maxsize of 64. For your use case you would have to adapt either this example or your client code. Also, to set the maxsize individually, one would have to implement another decorator, but I haven't wrapped my head around this since I don't need it.
For a more generic approach, one could use the decorator @cachetools.cache from a third-party library with an appropriate function set as key.
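A sketch matching that description: the wrapper turns the single mapping argument into a frozenset of items (which is hashable), and an inner lru_cache with maxsize=64 does the actual caching. The names are assumptions, not the original code:

    import functools

    def cached_mapping(func):
        """Cache a function whose single argument is a flat mapping with hashable values."""
        @functools.lru_cache(maxsize=64)
        def cached(items):
            return func(dict(items))        # rebuild the mapping for the wrapped function

        @functools.wraps(func)
        def wrapper(mapping):
            return cached(frozenset(mapping.items()))
        return wrapper

    @cached_mapping
    def lookup(substitutions):
        ...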
After deciding to drop lru_cache for our use case for now, we still came up with a solution. This decorator uses json to serialise and deserialise the args/kwargs sent to the cache. It works with any number of args. Use it as a decorator on a function instead of @lru_cache. The max size is set to 1024.
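A sketch of such a decorator under those assumptions: the JSON form of the args/kwargs is the cache key, a plain dict is the store, and the oldest entry is evicted once 1024 keys are reached (the name json_cache is made up; it only works for JSON-serialisable arguments):

    import functools
    import json

    def json_cache(func, max_size=1024):
        """Cache keyed on the JSON form of the call's args and kwargs."""
        cache = {}

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = json.dumps({"args": args, "kwargs": kwargs}, sort_keys=True)
            if key not in cache:
                if len(cache) >= max_size:
                    cache.pop(next(iter(cache)))   # drop the oldest entry
                cache[key] = func(*args, **kwargs)
            return cache[key]
        return wrapper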
The solution might be much simpler. lru_cache uses the parameters as the identifier for caching, so in the case of a dictionary, lru_cache doesn't know how to interpret it. You can serialize the dictionary parameter to a string and deserialize it back into a dictionary inside the function. Works like a charm.
the function:
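For instance (the parsing function expand is hypothetical; the dict travels as a JSON string so lru_cache can hash it):

    import json
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def expand(text, replacements_json):
        replacements = json.loads(replacements_json)   # back to a dict inside the function
        for old, new in replacements.items():
            text = text.replace(old, new)
        return text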
the call:
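And the call would serialise the dict at the boundary (sort_keys keeps the key stable for equal dicts):

    result = expand("Hello $name", json.dumps({"$name": "world"}, sort_keys=True))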
An extension of @Cedar's answer, adding a recursive freeze:
The recursive freeze:
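A sketch of the recursive freeze, assuming frozendict for mappings and tuples for other collections (the name deep_freeze is an assumption):

    from collections.abc import Collection, Hashable, Mapping
    from frozendict import frozendict

    def deep_freeze(thing):
        """Recursively convert mappings to frozendicts and other collections to tuples."""
        if thing is None or isinstance(thing, str):
            return thing
        if isinstance(thing, Mapping):
            return frozendict({k: deep_freeze(v) for k, v in thing.items()})
        if isinstance(thing, Collection):
            return tuple(deep_freeze(i) for i in thing)
        if not isinstance(thing, Hashable):
            raise TypeError(f"unfreezable type: {type(thing)!r}")
        return thing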
The decorator:
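And the decorator would apply deep_freeze to every argument before the cached function runs; stacked above lru_cache like the earlier freezeargs (deep_freeze_args is an assumed name):

    import functools

    def deep_freeze_args(func):
        """Freeze all positional and keyword arguments before calling func."""
        @functools.wraps(func)
        def wrapped(*args, **kwargs):
            return func(*deep_freeze(args), **deep_freeze(kwargs))
        return wrapped

    @deep_freeze_args
    @functools.lru_cache(maxsize=None)
    def parse(text, substitutions):
        ...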