Creating a cache with MapMaker
I want to use MapMaker to create a map that caches large objects,
which should be removed from the cache if there is not enough memory.
This little demo program seems to work fine:
public class TestValue {

    private final int id;
    // Large payload (~400 KB) so that a modest number of instances
    // puts noticeable pressure on the heap.
    private final int[] data = new int[100000];

    public TestValue(int id) {
        this.id = id;
    }

    @Override
    protected void finalize() throws Throwable {
        super.finalize();
        // Log reclamation so cache evictions become visible.
        System.out.println("finalized");
    }
}
import com.google.common.collect.MapMaker;

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.util.concurrent.ConcurrentMap;

public class Main {

    private ConcurrentMap<Integer, TestValue> cache;
    private MemoryMXBean memoryBean;

    public Main() {
        cache = new MapMaker()
                .weakKeys()
                .softValues()
                .makeMap();
        memoryBean = ManagementFactory.getMemoryMXBean();
    }

    public void test() {
        int i = 0;
        while (true) {
            System.out.println("Entries: " + cache.size() + " heap: "
                    + memoryBean.getHeapMemoryUsage() + " non-heap: "
                    + memoryBean.getNonHeapMemoryUsage());
            for (int j = 0; j < 10; j++) {
                i++;
                TestValue t = new TestValue(i);
                cache.put(i, t);
            }
            try {
                Thread.sleep(100);
            } catch (InterruptedException ex) {
            }
        }
    }

    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) {
        Main m = new Main();
        m.test();
    }
}
However, when I do the same thing in my real application, entries are removed from the cache essentially as soon as they are added. In my real application I also use Integers as keys, and the cached values are archive blocks read from disk that contain some data. As far as I understand, weak references are garbage-collected as soon as the referent is no longer strongly referenced, so this seems to make sense, because the keys are weak references. If I create the map like this:
data = new MapMaker()
        .softValues()
        .makeMap();
The entries are never garbage-collected and I get an out-of-memory
error in my test program. The finalize method on the TestValue entries
is never called. If I change the test method to the following:
public void test() {
    int i = 0;
    while (true) {
        // Purge entries whose soft values have already been cleared by the GC.
        for (final Entry<Integer, TestValue> entry : data.entrySet()) {
            if (entry.getValue() == null) {
                data.remove(entry.getKey());
            }
        }
        System.out.println("Entries: " + data.size() + " heap: "
                + memoryBean.getHeapMemoryUsage() + " non-heap: "
                + memoryBean.getNonHeapMemoryUsage());
        for (int j = 0; j < 10; j++) {
            i++;
            TestValue t = new TestValue(i);
            data.put(i, t);
        }
        try {
            Thread.sleep(100);
        } catch (InterruptedException ex) {
        }
    }
}
entries are removed from the cache and the finalizer on the TestValue
objects is called, but after a while I also get an out-of-memory
error.
So my question is: what is the right way to use MapMaker to create a
map that can be used as a cache? Why does my test program not remove
the entries as soon as possible if I use weakKeys? Is it possible to
add a reference queue to the cache map?
3 Answers
There are a lot of things that might be going on, but with respect to your test program using soft values: you can get an OutOfMemoryError even if you have SoftReferences which have not yet been garbage collected. That bears repeating: you can get an OutOfMemoryError even if you have SoftReferences which have not yet been cleared.
SoftReferences are a little weird; see http://jeremymanson.blogspot.com/2009/07/how-hotspot-decides-to-clear_07.html for a description of the current mechanics. Likely in your test case the GC just didn't have time to do two full GCs.
When you were using weakKeys, the GC cleared them right away and didn't have to wait for a full GC pause (because WeakReferences are collected aggressively).
In my opinion, if you want a memory-sensitive cache with Integer keys, I'd think the following is appropriate:
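A minimal sketch of what that could look like, assuming Guava's MapMaker with strong Integer keys (normal equals() comparison) and soft values that the GC may reclaim under memory pressure:

    // Sketch: strong keys, soft values; entries become eligible for
    // collection only when the heap runs low.
    ConcurrentMap<Integer, TestValue> cache = new MapMaker()
            .softValues()
            .makeMap();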
You can easily make a test program that throws OutOfMemoryError, but if your real application is reasonably well behaved and doesn't come under too much memory pressure, you might be OK. SoftReferences are pretty hard to get right.
If you need to call System.gc() to avoid running out of memory, I would instead recommend switching to an LRU map with a fixed maximum size (see the javadoc of java.util.LinkedHashMap for an example, and the sketch below). It's not concurrent, but I expect it will give you better throughput in the end than asking the system to do a bunch of extra full-pause garbage collections.
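A rough sketch of such a fixed-size LRU cache, following the removeEldestEntry pattern described in the LinkedHashMap javadoc (the class name and the capacity it is constructed with are placeholders):

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class LruCache<K, V> extends LinkedHashMap<K, V> {

        private final int maxEntries;

        public LruCache(int maxEntries) {
            // accessOrder = true: iteration order is least-recently-accessed first
            super(16, 0.75f, true);
            this.maxEntries = maxEntries;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            // Evict the least recently used entry once the cap is exceeded.
            return size() > maxEntries;
        }
    }

As noted above, this is not thread-safe by itself; wrapping it with Collections.synchronizedMap (or adding your own locking) would be needed for concurrent use.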
Oh, and a final note about integer keys and weakKeys(): MapMaker uses identity comparison for keys when using weak or soft keys, and that's pretty hard to do correctly. Witness the following:
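For instance, a small demonstration of the identity-comparison pitfall, assuming Guava's MapMaker (the key 40000 is an arbitrary value outside the default Integer autobox cache of -128..127):

    ConcurrentMap<Integer, String> map = new MapMaker()
            .weakKeys()
            .makeMap();

    map.put(40000, "foo");
    // weakKeys() makes the map compare keys by identity (==), and autoboxing
    // 40000 produces a distinct Integer instance on each call, so this lookup
    // will typically print null.
    System.out.println(map.get(40000));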
Good luck.
Weak keys seem like a mistake here. Try using strong keys, since your keys are Integers.
I'd like to bring your attention to Suppliers.memoizeWithExpiration, an instant cache.
http://guava-libraries.googlecode.com/svn/trunk/javadoc/com/google/common/base/Suppliers.html#memoizeWithExpiration(com.google.common.base.Supplier,long,java.util.concurrent.TimeUnit)
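A rough usage sketch, assuming Guava's Suppliers.memoizeWithExpiration; BlockCache and loadArchiveBlock() are hypothetical names standing in for your own code:

    import com.google.common.base.Supplier;
    import com.google.common.base.Suppliers;

    import java.util.concurrent.TimeUnit;

    public class BlockCache {

        // Cache the loaded block and refresh it at most every 10 minutes.
        private final Supplier<byte[]> cachedBlock = Suppliers.memoizeWithExpiration(
                new Supplier<byte[]>() {
                    @Override
                    public byte[] get() {
                        return loadArchiveBlock(); // hypothetical expensive disk read
                    }
                },
                10, TimeUnit.MINUTES);

        public byte[] getBlock() {
            // The first call loads the block; later calls reuse it until it expires.
            return cachedBlock.get();
        }

        private byte[] loadArchiveBlock() {
            // Placeholder for the real disk read.
            return new byte[0];
        }
    }

Note that memoizeWithExpiration caches a single value per Supplier rather than a keyed map, so it fits best when each archive block has its own Supplier.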