Is there a SoftHashMap in Scala?
I'm aware of this question for Java, but none of those implementations seem to play well with scala.collection.JavaConversions.

I'm looking for something simple (e.g. a single file, not a whole library) that implements a SoftHashMap such that it plays well with Scala's Map (i.e. supports getOrElseUpdate, unzip, and the remaining Scala Map methods).
Comments (4)
Implementation inspired by this Java WeakHashMap:
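The code block that originally accompanied this answer has not survived on this page. As a stand-in, here is a minimal sketch of the stated idea: a mutable Scala Map whose values are held through SoftReferences, with a ReferenceQueue used to purge entries once the GC has cleared them. It is written against the pre-2.13 mutable.Map API (the era of this thread), and the class and member names are mine, not the original author's.

```scala
import java.lang.ref.{ReferenceQueue, SoftReference}
import scala.collection.mutable

// A mutable Map whose values are reachable only through SoftReferences, so the
// GC may reclaim them under memory pressure; cleared entries are purged lazily
// on each access. (Written against the pre-2.13 mutable.Map API.)
class SoftHashMap[K, V <: AnyRef] extends mutable.Map[K, V] {

  // Pair each SoftReference with its key, so that once the referent is collected
  // and the reference is enqueued we know which map entry to drop.
  private class SoftValue(val key: K, value: V, queue: ReferenceQueue[V])
      extends SoftReference[V](value, queue)

  private val backing = mutable.HashMap.empty[K, SoftValue]
  private val queue   = new ReferenceQueue[V]

  // Drop entries whose values the GC has already cleared.
  private def purge(): Unit = {
    var ref = queue.poll()
    while (ref != null) {
      val sv = ref.asInstanceOf[SoftValue]
      // Only remove the entry if it still points at this exact reference.
      if (backing.get(sv.key).exists(_ eq sv)) backing -= sv.key
      ref = queue.poll()
    }
  }

  override def get(key: K): Option[V] = {
    purge()
    backing.get(key).flatMap(ref => Option(ref.get))
  }

  override def iterator: Iterator[(K, V)] = {
    purge()
    for ((k, ref) <- backing.iterator; v <- Option(ref.get)) yield (k, v)
  }

  override def +=(kv: (K, V)): this.type = {
    purge()
    backing += kv._1 -> new SoftValue(kv._1, kv._2, queue)
    this
  }

  override def -=(key: K): this.type = {
    purge()
    backing -= key
    this
  }
}
```

Because it extends scala.collection.mutable.Map, methods such as getOrElseUpdate and unzip that the question asks for come for free from the standard collection traits.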
I found one in liftweb. I haven't used it yet, so please check it yourself.
http://scala-tools.org/mvnsites/liftweb-2.4-M5/net/liftweb/util/SoftReferenceCache.html
Occasionally you get asked a question like, "what's the best way to poke your eye out with a stick" that you can go to great lengths answering how you should harden and sterilise the stick after carving a 1 inch hook into the end, and follow up with where the stick should be inserted and so on. Really though, the best answer is probably not exactly what was asked – but the question as to why on earth you'd want to do this in the first place!
This is one of those questions.
SoftReferences are something that initially sound like something you might want. A reference that does not get garbage collected until there is GC pressure. Presumably, you'd use this to cache something that was worth caching, usually because it was expensive to create in the first place.
The problem is, SoftRefs clear almost exactly when you don't want them to, when the GC is under pressure! This means that they will need to be recreated (expensive op) right when the VM is already busy and under GC pressure.
Furthermore, there is no way to hint the VM as to the priority of objects that are soft-referenced. The particular algorithm used for selecting which objects to clear is unspecified and VM dependent.
Essentially, SoftReferences are a misguided attempt to off-load an application level concern (caching) to the garbage-collector. You really should never* actually use them.
*never, modulo some very small and very specialised use-cases
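To make the behaviour being criticised here concrete, a tiny self-contained illustration (not from the original answer): the referent stays reachable only until the GC, under memory pressure, decides to clear it, at which point get() returns null and the expensive value has to be rebuilt.

```scala
import java.lang.ref.SoftReference

// Softly reference a large, "expensive" value.
val cached = new SoftReference(new Array[Byte](10 * 1024 * 1024))

// Later, possibly after the GC has been under pressure:
Option(cached.get) match {
  case Some(bytes) => println(s"still cached: ${bytes.length} bytes")
  case None        => println("cleared under GC pressure -- must recreate the expensive value")
}
```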
As other people have observed, SoftReferences are usually not The Right Thing to build a cache. However, some libraries provide better replacements. While the OP asked not to use a library, I still think this is the best answer possible. Plus, with SBT, downloading the library is quite simple.

In build.sbt, assuming you're building your project with SBT >= 0.10 (tested with 0.12), add the dependency sketched below. In client code, you can then build your map as shown under the version headings that follow (look into CacheBuilder's options for the meaning of the various parameters; the original answer used the values its author had chosen for their use case).
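The original build.sbt snippet is not preserved on this page. Since the answer builds its cache with Guava's CacheBuilder, it presumably added a Guava dependency; the version below is an era-appropriate guess, not the author's original:

```scala
// build.sbt -- Guava assumed from the CacheBuilder reference; the version is illustrative
libraryDependencies += "com.google.guava" % "guava" % "13.0.1"
```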
For Scala 2.10:
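The 2.10 snippet itself is also missing; the following is a sketch of the pattern described, wrapping the ConcurrentMap view of a Guava cache as a scala.collection.concurrent.Map via the JavaConversions implicits. The object name, key/value types, and tuning parameters (softValues, maximumSize, expireAfterWrite) are illustrative, not the author's originals:

```scala
import java.util.concurrent.TimeUnit
import scala.collection.JavaConversions._
import com.google.common.cache.CacheBuilder

object GuavaCacheExample210 {
  // Scala 2.10: the implicit conversion wraps the Java ConcurrentMap
  // as a scala.collection.concurrent.Map
  val cache: collection.concurrent.Map[String, String] =
    CacheBuilder.newBuilder()
      .softValues()                           // values held via SoftReferences
      .maximumSize(10000)                     // illustrative tuning parameters
      .expireAfterWrite(10, TimeUnit.MINUTES)
      .build[String, String]()
      .asMap()

  // The wrapped map then supports the usual Scala Map API:
  def lookup(k: String): String =
    cache.getOrElseUpdate(k, s"expensively computed value for $k")
}
```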
For Scala 2.9 (deprecated/not compiling in 2.10):
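Likewise, a sketch of the 2.9 flavour, which went through scala.collection.mutable.ConcurrentMap; as the answer notes, this form is deprecated and reportedly no longer compiles implicitly in 2.10. Again the object name, types, and parameters are illustrative:

```scala
import java.util.concurrent.TimeUnit
import scala.collection.JavaConversions._
import scala.collection.mutable.ConcurrentMap
import com.google.common.cache.CacheBuilder

object GuavaCacheExample29 {
  // Scala 2.9: the implicit conversion wraps the Java ConcurrentMap
  // as a scala.collection.mutable.ConcurrentMap (deprecated in 2.10)
  val cache: ConcurrentMap[String, String] =
    CacheBuilder.newBuilder()
      .softValues()
      .maximumSize(10000)
      .expireAfterWrite(10, TimeUnit.MINUTES)
      .build[String, String]()
      .asMap()
}
```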
To make it work for both versions, call the implicit conversion explicitly: it's JavaConversions.asScalaConcurrentMap. I've reported the problem on the scala-language mailing list (and will also report it on the bug tracker), so I hope that the 2.9 code will at least compile in 2.10 (while still causing a deprecation warning): https://groups.google.com/d/topic/scala-language/uXKRiGXb-44/discussion.