MemoryCache does not obey configured memory limits

Posted on 2024-11-27 09:42:57


I’m working with the .NET 4.0 MemoryCache class in an application and trying to limit the maximum cache size, but in my tests it does not appear that the cache is actually obeying the limits.

I'm using the settings which, according to MSDN, are supposed to limit the cache size:

  1. CacheMemoryLimitMegabytes: "The maximum memory size, in megabytes, that an instance of an object can grow to."
  2. PhysicalMemoryLimitPercentage: "The percentage of physical memory that the cache can use, expressed as an integer value from 1 to 100. The default is zero, which indicates that MemoryCache instances manage their own memory* based on the amount of memory that is installed on the computer." *This is not entirely correct: any value below 4 is ignored and replaced with 4.

I understand that these values are approximate rather than hard limits, since the thread that purges the cache fires every x seconds and also depends on the polling interval and other undocumented variables. However, even accounting for these variances, I'm seeing wildly inconsistent cache sizes at the point the first item is evicted, whether CacheMemoryLimitMegabytes and PhysicalMemoryLimitPercentage are set together or individually in a test app. To be sure, I ran each test 10 times and calculated the average figure.

These are the results of testing the example code below on a 32-bit Windows 7 PC with 3GB of RAM. The size of the cache is taken after the first call to CacheItemRemoved() in each test. (I am aware the actual size of the cache will be larger than this.)

MemLimitMB    MemLimitPct     AVG Cache MB on first expiry    
   1            NA              84
   2            NA              84
   3            NA              84
   6            NA              84
  NA             1              84
  NA             4              84
  NA            10              84
  10            20              81
  10            30              81
  10            39              82
  10            40              79
  10            49              146
  10            50              152
  10            60              212
  10            70              332
  10            80              429
  10           100              535
 100            39              81
 500            39              79
 900            39              83
1900            39              84
 900            41              81
 900            46              84
 900            49              1.8 GB approx. in Task Manager, no memory errors
 200            49              156
 100            49              153
2000            60              214
   5            60              78
   6            60              76
   7           100              82
  10           100              541

Here is the test application:

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Linq;
using System.Runtime.Caching;
using System.Text;
namespace FinalCacheTest
{       
    internal class Cache
    {
        private Object Statlock = new object();
        private int ItemCount;
        private long size;
        private MemoryCache MemCache;
        private CacheItemPolicy CIPOL = new CacheItemPolicy();

        public Cache(long CacheSize)
        {
            CIPOL.RemovedCallback = new CacheEntryRemovedCallback(CacheItemRemoved);
            NameValueCollection CacheSettings = new NameValueCollection(3);
            CacheSettings.Add("CacheMemoryLimitMegabytes", Convert.ToString(CacheSize)); 
            CacheSettings.Add("physicalMemoryLimitPercentage", Convert.ToString(49));  //set % here
            CacheSettings.Add("pollingInterval", Convert.ToString("00:00:10"));
            MemCache = new MemoryCache("TestCache", CacheSettings);
        }

        public void AddItem(string Name, string Value)
        {
            CacheItem CI = new CacheItem(Name, Value);
            MemCache.Add(CI, CIPOL);

            lock (Statlock)
            {
                ItemCount++;
                size = size + (Name.Length + Value.Length * 2);
            }

        }

        public void CacheItemRemoved(CacheEntryRemovedArguments Args)
        {
            Console.WriteLine("Cache contains {0} items. Size is {1} bytes", ItemCount, size);

            lock (Statlock)
            {
                ItemCount--;
                size = size - 108;
            }

            Console.ReadKey();
        }
    }
}

namespace FinalCacheTest
{
    internal class Program
    {
        private static void Main(string[] args)
        {
            int MaxAdds = 5000000;
            Cache MyCache = new Cache(1); // set CacheMemoryLimitMegabytes

            for (int i = 0; i < MaxAdds; i++)
            {
                MyCache.AddItem(Guid.NewGuid().ToString(), Guid.NewGuid().ToString());
            }

            Console.WriteLine("Finished Adding Items to Cache");
        }
    }
}

Why is MemoryCache not obeying the configured memory limits?

7 Answers

墨落成白 2024-12-04 09:42:57


Wow, so I just spent entirely too much time digging around in the CLR with reflector, but I think I finally have a good handle on what's going on here.

The settings are being read in correctly, but there seems to be a deep-seated problem in the CLR itself that looks like it will render the memory limit setting essentially useless.

The following code is reflected out of the System.Runtime.Caching DLL, for the CacheMemoryMonitor class (there is a similar class that monitors physical memory and deals with the other setting, but this is the more important one):

protected override int GetCurrentPressure()
{
  int num = GC.CollectionCount(2);
  SRef ref2 = this._sizedRef;
  if ((num != this._gen2Count) && (ref2 != null))
  {
    this._gen2Count = num;
    this._idx ^= 1;
    this._cacheSizeSampleTimes[this._idx] = DateTime.UtcNow;
    this._cacheSizeSamples[this._idx] = ref2.ApproximateSize;
    IMemoryCacheManager manager = s_memoryCacheManager;
    if (manager != null)
    {
      manager.UpdateCacheSize(this._cacheSizeSamples[this._idx], this._memoryCache);
    }
  }
  if (this._memoryLimit <= 0L)
  {
    return 0;
  }
  long num2 = this._cacheSizeSamples[this._idx];
  if (num2 > this._memoryLimit)
  {
    num2 = this._memoryLimit;
  }
  return (int) ((num2 * 100L) / this._memoryLimit);
}

The first thing you might notice is that it doesn't even try to look at the size of the cache until after a Gen2 garbage collection, instead just falling back on the existing stored size value in cacheSizeSamples. So you won't ever be able to hit the target right on, but if the rest worked we would at least get a size measurement before we got in real trouble.
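Because GetCurrentPressure() only takes a fresh size sample when GC.CollectionCount(2) has advanced, one blunt way to demonstrate (and partly work around) the sampling behavior is to force periodic full collections so a sample is always available. This is my own illustrative sketch, not something the reflected code does, and forced Gen2 collections are expensive:

```csharp
using System;
using System.Threading;

// Illustrative only: a timer that forces a full (Gen2) collection so that
// MemoryCache's GetCurrentPressure() sees a changed GC.CollectionCount(2)
// and refreshes its cached size sample on the next poll.
internal static class Gen2Pump
{
    public static Timer Start(TimeSpan period)
    {
        return new Timer(_ =>
        {
            // Bumps GC.CollectionCount(2), which is the trigger the
            // reflected code checks before re-sampling the cache size.
            GC.Collect(2, GCCollectionMode.Forced);
        }, null, period, period);
    }
}
```

Starting the pump with, say, Gen2Pump.Start(TimeSpan.FromSeconds(30)) keeps the sample no more than ~30 seconds stale, at the cost of regular full GCs.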

So assuming a Gen2 GC has occurred, we run into problem 2, which is that ref2.ApproximateSize does a horrible job of actually approximating the size of the cache. Slogging through CLR junk I found that this is a System.SizedReference, and this is what it's doing to get the value (IntPtr is a handle to the MemoryCache object itself):

[SecurityCritical]
[MethodImpl(MethodImplOptions.InternalCall)]
private static extern long GetApproximateSizeOfSizedRef(IntPtr h);

I'm assuming that the extern declaration means it dives into unmanaged Windows territory at this point, and I have no idea how to start finding out what it does there. From what I've observed, though, it does a horrible job of approximating the overall size.

The third noticeable thing is the call to manager.UpdateCacheSize, which sounds like it should do something. Unfortunately, in any normal use s_memoryCacheManager will always be null. The field is set from the public static member ObjectCache.Host, which is exposed for the user to mess with if he so chooses. I was actually able to make this thing sort of work as intended by slopping together my own IMemoryCacheManager implementation, setting it as ObjectCache.Host, and then running the sample. At that point, though, it seems like you might as well just write your own cache implementation and not bother with any of this, especially since I have no idea whether setting your own class as ObjectCache.Host (which is static, so it affects every MemoryCache in the process) to measure the cache could mess up other things.

I have to believe that at least part of this (if not a couple parts) is just a straight up bug. It'd be nice to hear from someone at MS what the deal was with this thing.

TL;DR version of this giant answer: assume that CacheMemoryLimitMegabytes is completely busted at this point in time. You can set it to 10 MB and then proceed to fill the cache up to ~2GB, blowing an out-of-memory exception without ever tripping item removal.
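For reference, here is a minimal sketch of the ObjectCache.Host wiring described above. The class name and the trim-when-over-budget policy are my own illustration, not the author's actual code; note that ObjectCache.Host can be assigned only once per process and must be set before the first MemoryCache is created:

```csharp
using System;
using System.Runtime.Caching;
using System.Runtime.Caching.Hosting;

// Illustrative host: receives the size samples that CacheMemoryMonitor
// reports via manager.UpdateCacheSize() and trims the cache when the
// reported size exceeds a fixed byte budget.
internal class SizeReportingHost : IServiceProvider, IMemoryCacheManager
{
    private readonly long _budgetBytes;
    public long LastReportedSize { get; private set; }

    public SizeReportingHost(long budgetBytes)
    {
        _budgetBytes = budgetBytes;
    }

    // Called from GetCurrentPressure() after a Gen2 collection has occurred.
    public void UpdateCacheSize(long size, MemoryCache cache)
    {
        LastReportedSize = size;
        if (size > _budgetBytes && cache != null)
        {
            cache.Trim(25); // evict roughly 25% of entries (my chosen policy)
        }
    }

    public void ReleaseCache(MemoryCache cache)
    {
        // Nothing to clean up in this sketch.
    }

    public object GetService(Type serviceType)
    {
        // MemoryCache asks its host for IMemoryCacheManager via this call.
        return serviceType == typeof(IMemoryCacheManager) ? this : null;
    }
}
```

Wire it up once at startup, before any cache exists: ObjectCache.Host = new SizeReportingHost(10L * 1024 * 1024);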

虚拟世界 2024-12-04 09:42:57


I know this answer is crazy late, but better late than never. I wanted to let you know that I wrote a version of MemoryCache that resolves the Gen 2 Collection issues automatically for you. It therefore trims whenever the polling interval indicates memory pressure. If you're experiencing this issue, give it a go!

http://www.nuget.org/packages/SharpMemoryCache

You can also find it on GitHub if you're curious about how I solved it. The code is somewhat simple.

https://github.com/haneytron/sharpmemorycache

别挽留 2024-12-04 09:42:57


I've encountered this issue as well. I'm caching objects that are being fired into my process dozens of times per second.

I have found the following configuration and usage frees the items every 5 seconds most of the time.

App.config:

Take note of cacheMemoryLimitMegabytes. When this was set to zero, the purging routine would not fire in a reasonable time.

   <system.runtime.caching>
    <memoryCache>
      <namedCaches>
        <add name="Default" cacheMemoryLimitMegabytes="20" physicalMemoryLimitPercentage="0" pollingInterval="00:00:05" />
      </namedCaches>
    </memoryCache>
  </system.runtime.caching>  

Adding to cache:

MemoryCache.Default.Add(someKeyValue, objectToCache, new CacheItemPolicy { AbsoluteExpiration = DateTime.Now.AddSeconds(5), RemovedCallback = cacheItemRemoved });

Confirming the cache removal is working:

void cacheItemRemoved(CacheEntryRemovedArguments arguments)
{
    System.Diagnostics.Debug.WriteLine("Item removed from cache: {0} at {1}", arguments.CacheItem.Key, DateTime.Now.ToString());
}
怪异←思 2024-12-04 09:42:57


I have done some testing with the example of @Canacourse and the modification of @woany and I think there are some critical calls that block the cleaning of the memory cache.

public void CacheItemRemoved(CacheEntryRemovedArguments Args)
{
    // this WriteLine() will block the thread of
    // the MemoryCache long enough to slow it down,
    // and it will never catch up the amount of memory
    // beyond the limit
    Console.WriteLine("...");

    // ...

    // this ReadKey() will block the thread of 
    // the MemoryCache completely, till you press any key
    Console.ReadKey();
}

But why does the modification of @woany seems to keep the memory at the same level? Firstly, the RemovedCallback is not set and there is no console output or waiting for input that could block the thread of the memory cache.

Secondly...

public void AddItem(string Name, string Value)
{
    // ...

    // this WriteLine will block the main thread long enough,
    // so that the thread of the MemoryCache can do its work more frequently
    Console.WriteLine("...");
}

A Thread.Sleep(1) every ~1000th AddItem() would have the same effect.

Well, it's not a very deep investigation of the problem, but it looks as if the thread of the MemoryCache does not get enough CPU time for cleaning, while many new elements are added.
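Applying that observation to the original test app, a sketch (my own illustration, under the assumption that the callback runs on the cache's internal trim thread): keep the RemovedCallback cheap and non-blocking, and defer console output to the main thread:

```csharp
using System.Collections.Concurrent;
using System.Runtime.Caching;

// Illustrative replacement for the question's CacheItemRemoved(): instead of
// Console.WriteLine()/Console.ReadKey() (which stall the thread the cache
// trims on), enqueue a note and return immediately. The main thread can
// drain Entries whenever it likes.
internal static class EvictionLog
{
    public static readonly ConcurrentQueue<string> Entries =
        new ConcurrentQueue<string>();

    public static void OnRemoved(CacheEntryRemovedArguments args)
    {
        Entries.Enqueue(args.CacheItem.Key + " (" + args.RemovedReason + ")");
    }
}
```

Register it with CIPOL.RemovedCallback = EvictionLog.OnRemoved; and print the queued entries from the add loop instead of from the callback.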

婴鹅 2024-12-04 09:42:57


I (thankfully) stumbled across this useful post yesterday when first attempting to use the MemoryCache. I thought it would be a simple case of setting values and using the classes but I encountered similar issues outlined above. To try and see what was going on I extracted the source using ILSpy and then set up a test and stepped through the code. My test code was very similar to the code above so I won't post it. From my tests I noticed that the measurement of the cache size was never particularly accurate (as mentioned above) and given the current implementation would never work reliably. However the physical measurement was fine and if the physical memory was measured at every poll then it seemed to me like the code would work reliably. So, I removed the gen 2 garbage collection check within MemoryCacheStatistics; under normal conditions no memory measurements will be taken unless there has been another gen 2 garbage collection since the last measurement.

In a test scenario this obviously makes a big difference, as the cache is being hit constantly, so objects never have the chance to get to gen 2. I think we are going to use the modified build of this dll on our project and switch to the official MS build when .NET 4.5 comes out (which, according to the connect article mentioned above, should have the fix in it). Logically I can see why the gen 2 check has been put in place, but in practice I'm not sure it makes much sense. If the memory reaches 90% (or whatever limit it has been set to), then it should not matter whether a gen 2 collection has occurred or not; items should be evicted regardless.

I left my test code running for about 15 minutes with physicalMemoryLimitPercentage set to 65%. I saw memory usage remain between 65-68% during the test and saw things getting evicted properly. In my test I set pollingInterval to 5 seconds, physicalMemoryLimitPercentage to 65, and cacheMemoryLimitMegabytes to 0 to leave it at its default.

Following the above advice, an implementation of IMemoryCacheManager could be written to evict things from the cache. It would, however, suffer from the gen 2 check issue mentioned. Depending on the scenario, though, that may not be a problem in production code, and this may work well enough for some people.

悟红尘 2024-12-04 09:42:57


It turned out this is not a bug; all you need to do is set the polling time span to enforce the limits. It seems that if you leave polling unset, it will never trigger. I just tested it, and there is no need for wrappers or any extra code:

private static readonly NameValueCollection Collection = new NameValueCollection
{
    { "CacheMemoryLimitMegabytes", "20" },
    { "PollingInterval", TimeSpan.FromMilliseconds(60000).ToString() } // check the limits every 60 seconds
};

Set the value of "PollingInterval" based on how fast the cache grows: if it grows too fast, increase the frequency of the polling checks; otherwise keep the checks less frequent to avoid overhead.
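As a quick sanity check on this approach (my own addition; the cache name below is arbitrary), MemoryCache exposes the limits it actually parsed, so you can confirm the settings took effect rather than inferring it from eviction behavior:

```csharp
using System;
using System.Collections.Specialized;
using System.Runtime.Caching;

internal static class VerifyLimits
{
    // Builds a cache with an explicit memory limit and polling interval.
    public static MemoryCache CreateLimitedCache()
    {
        var settings = new NameValueCollection
        {
            { "CacheMemoryLimitMegabytes", "20" },
            { "PollingInterval", "00:01:00" } // enforce the limits every 60 seconds
        };
        return new MemoryCache("LimitedCache", settings);
    }

    private static void Main()
    {
        MemoryCache cache = CreateLimitedCache();
        // CacheMemoryLimit reports the effective limit in bytes.
        Console.WriteLine(cache.CacheMemoryLimit);
        Console.WriteLine(cache.PollingInterval);
    }
}
```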

老子叫无熙 2024-12-04 09:42:57


If you use the following modified class and monitor the memory via Task Manager, it does in fact get trimmed:

internal class Cache
{
    private Object Statlock = new object();
    private int ItemCount;
    private long size;
    private MemoryCache MemCache;
    private CacheItemPolicy CIPOL = new CacheItemPolicy();

    public Cache(double CacheSize)
    {
        NameValueCollection CacheSettings = new NameValueCollection(3);
        CacheSettings.Add("cacheMemoryLimitMegabytes", Convert.ToString(CacheSize));
        CacheSettings.Add("pollingInterval", Convert.ToString("00:00:01"));
        MemCache = new MemoryCache("TestCache", CacheSettings);
    }

    public void AddItem(string Name, string Value)
    {
        CacheItem CI = new CacheItem(Name, Value);
        MemCache.Add(CI, CIPOL);

        Console.WriteLine(MemCache.GetCount());
    }
}