HashMap corrupted / performance issues
Currently I have a HashMap implemented as
private static Map<String, Item> cached = new HashMap<String, Item>();
and Item is an object with the properties Date expirationTime and byte[] data.
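For reference, this is a minimal sketch of what that Item class could look like, inferred from the fields named above and the new Item(data) constructor used later in the question; the real class is not shown:

import java.util.Date;

public class Item
{
    public Date expirationTime;   // set at caching time, checked on every read
    public byte[] data;           // the cached payload

    public Item(byte[] data)
    {
        this.data = data;
    }
}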
This map is used when multiple threads concurrently start hitting it. The check I do is:
1.
public static final byte[] getCachedData(HttpServletRequest request) throws ServletException
{
    String url = getFullURL(request);
    Map<String, Item> cache = getCache(request); // this chec
    Item item = null;
    synchronized (cache)
    {
        item = cache.get(url);
        if (null == item)
            return null;
        // Make sure that it is not over an hour old.
        if (item.expirationTime.getTime() < System.currentTimeMillis())
        {
            cache.remove(url);
            item = null;
        }
    }
    if (null == item)
    {
        log.info("Expiring Item: " + url);
        return null;
    }
    return item.data;
}
2. If the data returned is null, then we create the data and cache it in the HashMap:
public static void cacheDataX(HttpServletRequest request, byte[] data, Integer minutes) throws ServletException
{
    Item item = new Item(data);
    String url = getFullURL(request);
    Map<String, Item> cache = getCache(request);
    log.info("Caching Item: " + url + " - Bytes: " + data.length);
    synchronized (cache)
    {
        Calendar cal = Calendar.getInstance();
        cal.add(Calendar.MINUTE, minutes);
        item.expirationTime = cal.getTime();
        cache.put(url, item);
    }
}
It seems that if multiple threads access the same key (the url in this case), the data gets added to the cache more than once at the same key [since getCachedData will return null for multiple threads, because the HashMap has not finished writing the data for the first thread].
Any suggestions on how to solve the issue?
Comments (2)
In cacheDataX add a check for the existence of the item before you add (inside of the synchronized block).
This will ensure that multiple threads that have already done a lookup and returned null cannot all add the same data to the cache. One will add it and the other threads will silently ignore it, since the cache has already been updated.
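A minimal sketch of that check, reusing the helpers and fields from the question; making the check expiry-aware (so a stale entry is still replaced) and logging only when a put actually happens are assumptions, the answer itself only calls for an existence check:

public static void cacheDataX(HttpServletRequest request, byte[] data, Integer minutes) throws ServletException
{
    Item item = new Item(data);
    String url = getFullURL(request);
    Map<String, Item> cache = getCache(request);
    synchronized (cache)
    {
        // Skip the put if another thread has already cached a live entry for this url.
        Item existing = cache.get(url);
        if (existing != null && existing.expirationTime.getTime() >= System.currentTimeMillis())
        {
            return; // another thread won the race; keep its entry
        }
        Calendar cal = Calendar.getInstance();
        cal.add(Calendar.MINUTE, minutes);
        item.expirationTime = cal.getTime();
        cache.put(url, item);
        log.info("Caching Item: " + url + " - Bytes: " + data.length);
    }
}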
You need one synchronized block to cover both getting something from the cache and inserting into the cache. As the code stands, you have a race condition: multiple threads can execute step 1 before anybody executes step 2.
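One way to express that is a single get-or-create method that holds the lock across both the lookup and the insert. This is only a sketch built on the question's helpers; loadData is a hypothetical stand-in for whatever currently produces the bytes between the getCachedData and cacheDataX calls:

public static byte[] getOrCacheData(HttpServletRequest request, Integer minutes) throws ServletException
{
    String url = getFullURL(request);
    Map<String, Item> cache = getCache(request);
    synchronized (cache)
    {
        Item item = cache.get(url);
        // Reuse the entry only if it exists and has not expired yet.
        if (item != null && item.expirationTime.getTime() >= System.currentTimeMillis())
        {
            return item.data;
        }
        // Build the data while still holding the lock, so only one thread does it per key.
        byte[] data = loadData(request); // hypothetical: whatever currently creates the bytes
        Item fresh = new Item(data);
        Calendar cal = Calendar.getInstance();
        cal.add(Calendar.MINUTE, minutes);
        fresh.expirationTime = cal.getTime();
        cache.put(url, fresh);
        return data;
    }
}

Note that this serializes all cache misses behind one lock; if building the data is expensive, per-key locking or a ConcurrentHashMap (computeIfAbsent on Java 8+) is a possible trade-off, but the single synchronized block is the direct fix for the race described in the question.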