Is there any LRU implementation of IDictionary?

I would like to implement a simple in-memory LRU cache system, and I was thinking about a solution based on an IDictionary implementation that could handle a hashed LRU mechanism.
Coming from Java, I have experience with LinkedHashMap, which works fine for what I need; I can't find a similar solution for .NET anywhere.

Has anyone developed one, or has anyone had experience with something like this?

傾城如夢未必闌珊 2024-07-23 12:28:10

This is a very simple and fast implementation we developed for a web site we own.

We tried to improve the code as much as possible, while keeping it thread safe.
I think the code is very simple and clear, but if you need some explanation or a guide related to how to use it, don't hesitate to ask.

namespace LRUCache
{
    using System.Collections.Generic;
    using System.Runtime.CompilerServices;

    public class LRUCache<K, V>
    {
        private int capacity;
        private Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>> cacheMap = new Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>>();
        private LinkedList<LRUCacheItem<K, V>> lruList = new LinkedList<LRUCacheItem<K, V>>();

        public LRUCache(int capacity)
        {
            this.capacity = capacity;
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public V get(K key)
        {
            LinkedListNode<LRUCacheItem<K, V>> node;
            if (cacheMap.TryGetValue(key, out node))
            {
                V value = node.Value.value;
                lruList.Remove(node);
                lruList.AddLast(node);
                return value;
            }
            return default(V);
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public void add(K key, V val)
        {
            if (cacheMap.TryGetValue(key, out var existingNode))
            {
                lruList.Remove(existingNode);
            }
            else if (cacheMap.Count >= capacity)
            {
                RemoveFirst();
            }

            LRUCacheItem<K, V> cacheItem = new LRUCacheItem<K, V>(key, val);
            LinkedListNode<LRUCacheItem<K, V>> node = new LinkedListNode<LRUCacheItem<K, V>>(cacheItem);
            lruList.AddLast(node);
            // cacheMap.Add(key, node) would throw if the key already exists, so use the indexer instead.
            cacheMap[key] = node;
        }

        private void RemoveFirst()
        {
            // Remove from LRUPriority
            LinkedListNode<LRUCacheItem<K,V>> node = lruList.First;
            lruList.RemoveFirst();

            // Remove from cache
            cacheMap.Remove(node.Value.key);
        }
    }

    class LRUCacheItem<K,V>
    {
        public LRUCacheItem(K k, V v)
        {
            key = k;
            value = v;
        }
        public K key;
        public V value;
    }
}
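
For instance, a quick usage sketch (values invented for illustration):

var cache = new LRUCache<int, string>(3);
cache.add(1, "one");
cache.add(2, "two");
cache.add(3, "three");
cache.get(1);                   // touches 1, so 2 is now the least recently used
cache.add(4, "four");           // at capacity: evicts 2
string gone = cache.get(2);     // miss: returns default(string), i.e. null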
踏雪无痕 2024-07-23 12:28:10

There is nothing in the base class libraries that does this.

On the free side, maybe something like C5's HashedLinkedList would work.

If you're willing to pay, maybe check out this C# toolkit. It contains an implementation.
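
To make the C5 suggestion concrete, here is a rough sketch pairing HashedLinkedList (whose hash index makes Remove(item) constant time) with a plain dictionary for the values. The C5 member names used here (Add, Remove, RemoveFirst) are my assumption about its API, so verify them against the library:

using C5;
using SCG = System.Collections.Generic;

public class C5LruCache<K, V>
{
    private readonly int capacity;

    // Recency order: least recently used at the front, most recent at the back.
    private readonly HashedLinkedList<K> order = new HashedLinkedList<K>();
    private readonly SCG.Dictionary<K, V> values = new SCG.Dictionary<K, V>();

    public C5LruCache(int capacity) { this.capacity = capacity; }

    public bool TryGet(K key, out V value)
    {
        if (!values.TryGetValue(key, out value)) return false;
        order.Remove(key);    // O(1) removal thanks to the hash index
        order.Add(key);       // re-append as most recently used
        return true;
    }

    public void Add(K key, V value)
    {
        if (values.ContainsKey(key))
        {
            order.Remove(key);
        }
        else if (values.Count >= capacity)
        {
            K oldest = order.RemoveFirst();    // assumed to return the removed item
            values.Remove(oldest);
        }

        order.Add(key);
        values[key] = value;
    }
}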

2024-07-23 12:28:10

The LRUCache answer with sample code above (by Martin) uses MethodImplOptions.Synchronized, which is equivalent to putting lock(this) around each method call. Whilst correct, this global lock will significantly reduce throughput under concurrent load.

To solve this I implemented a thread safe pseudo LRU designed for concurrent workloads. Performance is very close to ConcurrentDictionary, ~10x faster than MemoryCache and hit rate is better than a conventional LRU. Full analysis provided in the GitHub link below.

Usage looks like this:

int capacity = 500;
var lru = new ConcurrentLru<int, SomeItem>(capacity);

var value = lru.GetOrAdd(1, (k) => new SomeItem(k));

GitHub: https://github.com/bitfaster/BitFaster.Caching

Install-Package BitFaster.Caching
薄凉少年不暖心 2024-07-23 12:28:10

I've recently released a class called LurchTable to address the need for a C# variant of the LinkedHashMap. A brief discussion of the LurchTable can be found here.

Basic features:

  • Linked Concurrent Dictionary by Insertion, Modification, or Access
  • Dictionary/ConcurrentDictionary interface support
  • Peek/TryDequeue/Dequeue access to 'oldest' entry
  • Allows hard-limit on items enforced at insertion
  • Exposes events for add, update, and remove

Source Code: http://csharptest.net/browse/src/Library/Collections/LurchTable.cs

GitHub: https://github.com/csharptest/CSharpTest.Net.Collections

HTML Help: http://help.csharptest.net/

PM> Install-Package CSharpTest.Net.Collections
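
A hypothetical usage sketch (assuming the LurchTable constructor takes an ordering mode and a hard item limit; verify the exact signature against the package):

using CSharpTest.Net.Collections;

// Order by access so the least recently used entry is dequeued first;
// the limit of 1000 is enforced at insertion, per the feature list above.
var cache = new LurchTable<string, int>(LurchTableOrder.Access, 1000);
cache["answer"] = 42;

if (cache.TryGetValue("answer", out int value))
{
    // Reading the entry refreshed its position in the access order.
}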

幸福%小乖 2024-07-23 12:28:10

Found your answer while googling, and also found this:

http://code.google.com/p/csharp-lru-cache/

csharp-lru-cache: LRU cache collection class library

This is a collection class that
functions as a least-recently-used
cache. It implements ICollection<T>,
but also exposes three other members:

  • Capacity, the maximum number of items
    the cache can contain. Once the
    collection is at capacity, adding a
    new item to the cache will cause the
    least recently used item to be
    discarded. If the Capacity is set to 0
    at construction, the cache will not
    automatically discard items.
  • Oldest,
    the oldest (i.e. least recently used)
    item in the collection.
  • DiscardingOldestItem, an event raised
    when the cache is about to discard its
    oldest item. This is an extremely
    simple implementation. While its Add
    and Remove methods are thread-safe, it
    shouldn't be used in heavy
    multithreading environments because
    the entire collection is locked during
    those methods.
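
If the members behave as described, usage would look roughly like this (the class name and constructor are guesses for illustration; check the project for the real ones):

var cache = new LruCache<string>(capacity: 100);
cache.DiscardingOldestItem += (sender, args) =>
    Console.WriteLine("About to discard: " + cache.Oldest);
cache.Add("first");
cache.Add("second");    // once at capacity, this discards the least recently used item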
享受孤独 2024-07-23 12:28:10

The Caching Application Block of EntLib has an LRU scavenging option out of the box and can store items in memory. It might be a bit heavyweight for what you want, though.

清醇 2024-07-23 12:28:10

This takes Martin's code, applies Mr T's suggestions, and makes it StyleCop friendly. Oh, and it also allows values to be disposed of as they cycle out of the cache.

namespace LruCache
{
    using System;
    using System.Collections.Generic;

    /// <summary>
    /// A least-recently-used cache stored like a dictionary.
    /// </summary>
    /// <typeparam name="TKey">
    /// The type of the key to the cached item
    /// </typeparam>
    /// <typeparam name="TValue">
    /// The type of the cached item.
    /// </typeparam>
    /// <remarks>
    /// Derived from https://stackoverflow.com/a/3719378/240845
    /// </remarks>
    public class LruCache<TKey, TValue>
    {
        private readonly Dictionary<TKey, LinkedListNode<LruCacheItem>> cacheMap =
            new Dictionary<TKey, LinkedListNode<LruCacheItem>>();

        private readonly LinkedList<LruCacheItem> lruList =
            new LinkedList<LruCacheItem>();

        private readonly Action<TValue> dispose;

        /// <summary>
        /// Initializes a new instance of the <see cref="LruCache{TKey, TValue}"/>
        /// class.
        /// </summary>
        /// <param name="capacity">
        /// Maximum number of elements to cache.
        /// </param>
        /// <param name="dispose">
        /// When elements cycle out of the cache, disposes them. May be null.
        /// </param>
        public LruCache(int capacity, Action<TValue> dispose = null)
        {
            this.Capacity = capacity;
            this.dispose = dispose;
        }

        /// <summary>
        /// Gets the capacity of the cache.
        /// </summary>
        public int Capacity { get; }

        /// <summary>Gets the value associated with the specified key.</summary>
        /// <param name="key">
        /// The key of the value to get.
        /// </param>
        /// <param name="value">
        /// When this method returns, contains the value associated with the specified
        /// key, if the key is found; otherwise, the default value for the type of the 
        /// <paramref name="value" /> parameter. This parameter is passed
        /// uninitialized.
        /// </param>
        /// <returns>
        /// true if the <see cref="T:System.Collections.Generic.Dictionary`2" /> 
        /// contains an element with the specified key; otherwise, false.
        /// </returns>
        public bool TryGetValue(TKey key, out TValue value)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                    return true;
                }

                value = default(TValue);
                return false;
            }
        }

        /// <summary>
        /// Looks for a value for the matching <paramref name="key"/>. If not found, 
        /// calls <paramref name="valueGenerator"/> to retrieve the value and add it to
        /// the cache.
        /// </summary>
        /// <param name="key">
        /// The key of the value to look up.
        /// </param>
        /// <param name="valueGenerator">
        /// Generates a value if one isn't found.
        /// </param>
        /// <returns>
        /// The requested value.
        /// </returns>
        public TValue Get(TKey key, Func<TValue> valueGenerator)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                TValue value;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                }
                else
                {
                    value = valueGenerator();
                    if (this.cacheMap.Count >= this.Capacity)
                    {
                        this.RemoveFirst();
                    }

                    LruCacheItem cacheItem = new LruCacheItem(key, value);
                    node = new LinkedListNode<LruCacheItem>(cacheItem);
                    this.lruList.AddLast(node);
                    this.cacheMap.Add(key, node);
                }

                return value;
            }
        }

        /// <summary>
        /// Adds the specified key and value to the cache, replacing any existing
        /// entry with the same key.
        /// </summary>
        /// <param name="key">
        /// The key of the element to add.
        /// </param>
        /// <param name="value">
        /// The value of the element to add. The value can be null for reference types.
        /// </param>
        public void Add(TKey key, TValue value)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> existing;
                if (this.cacheMap.TryGetValue(key, out existing))
                {
                    // Replacing an existing key: unlink the old node first so the
                    // list and map stay consistent (cacheMap.Add would otherwise
                    // throw after the list had already been modified).
                    this.lruList.Remove(existing);
                    this.cacheMap.Remove(key);
                }
                else if (this.cacheMap.Count >= this.Capacity)
                {
                    this.RemoveFirst();
                }

                LruCacheItem cacheItem = new LruCacheItem(key, value);
                LinkedListNode<LruCacheItem> node = 
                    new LinkedListNode<LruCacheItem>(cacheItem);
                this.lruList.AddLast(node);
                this.cacheMap.Add(key, node);
            }
        }

        private void RemoveFirst()
        {
            // Remove from LRUPriority
            LinkedListNode<LruCacheItem> node = this.lruList.First;
            this.lruList.RemoveFirst();

            // Remove from cache
            this.cacheMap.Remove(node.Value.Key);

            // dispose
            this.dispose?.Invoke(node.Value.Value);
        }

        private class LruCacheItem
        {
            public LruCacheItem(TKey k, TValue v)
            {
                this.Key = k;
                this.Value = v;
            }

            public TKey Key { get; }

            public TValue Value { get; }
        }
    }
}
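
A quick usage sketch (keys and values invented for illustration):

var cache = new LruCache<string, byte[]>(capacity: 100);
cache.Add("a.bin", new byte[] { 1, 2, 3 });

// Get either returns the cached value or generates and caches a new one.
byte[] data = cache.Get("b.bin", () => new byte[] { 4, 5, 6 });

if (cache.TryGetValue("a.bin", out byte[] cached))
{
    // Hit: "a.bin" is now the most recently used entry.
}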
淡看悲欢离合 2024-07-23 12:28:10

I don't believe so. I've certainly seen hand-rolled ones implemented several times in various unrelated projects (which more or less confirms this; if there were one, surely at least one of those projects would have used it).

It's pretty simple to implement, and it usually gets done by creating a class that contains both a Dictionary and a List.

The keys go in the list (in order) and the items go in the dictionary.
When you Add a new item to the collection, the function checks the length of the list, pulls out the last key (if the list is too long), and then evicts that key and its value from the dictionary to match. Not much more to it, really; a minimal sketch follows.
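
A minimal sketch of that approach, assuming eviction from the front of the list (not thread-safe, and List.Remove is O(n), so it only suits small caches):

using System.Collections.Generic;

public class SimpleLruCache<TKey, TValue>
{
    private readonly int capacity;
    private readonly List<TKey> keys = new List<TKey>();   // most recently used at the end
    private readonly Dictionary<TKey, TValue> items = new Dictionary<TKey, TValue>();

    public SimpleLruCache(int capacity) { this.capacity = capacity; }

    public void Add(TKey key, TValue value)
    {
        if (items.ContainsKey(key))
        {
            keys.Remove(key);                 // re-adding refreshes recency
        }
        else if (items.Count >= capacity)
        {
            items.Remove(keys[0]);            // evict the least recently used
            keys.RemoveAt(0);
        }

        keys.Add(key);
        items[key] = value;
    }

    public bool TryGet(TKey key, out TValue value)
    {
        if (!items.TryGetValue(key, out value)) return false;
        keys.Remove(key);                     // move to the most-recent position
        keys.Add(key);
        return true;
    }
}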

蹲在坟头点根烟 2024-07-23 12:28:10

I like Lawrence's implementation. A hashtable plus a linked list is a good solution.

Regarding threading, I would not lock this with [MethodImpl(MethodImplOptions.Synchronized)], but would rather use a ReaderWriterLockSlim or a spin lock (since contention is usually brief) instead.

In the Get function, I would first check whether the node is already the most recently used item, rather than always removing and re-adding it. That check can stay within a read lock that doesn't block other readers; a sketch follows.
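
A sketch of that idea, written against the fields of Martin's class above (cacheMap and lruList) and assuming using System.Threading; note the re-check after dropping the read lock, since another thread may have moved or evicted the node in between:

private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

public bool TryGetValue(K key, out V value)
{
    rwLock.EnterReadLock();
    try
    {
        LinkedListNode<LRUCacheItem<K, V>> node;
        if (!cacheMap.TryGetValue(key, out node))
        {
            value = default(V);
            return false;
        }

        value = node.Value.value;
        if (node == lruList.Last)
        {
            return true;    // already most recently used: read lock is enough
        }
    }
    finally
    {
        rwLock.ExitReadLock();
    }

    // The node needs to move to the back; take the write lock and re-validate.
    rwLock.EnterWriteLock();
    try
    {
        LinkedListNode<LRUCacheItem<K, V>> node;
        if (cacheMap.TryGetValue(key, out node) && node != lruList.Last)
        {
            lruList.Remove(node);
            lruList.AddLast(node);
        }
    }
    finally
    {
        rwLock.ExitWriteLock();
    }

    return true;
}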

痴意少年 2024-07-23 12:28:10

Here is a modern implementation of an LRUCache<TKey, TValue> collection, for .NET 6 and later. The main feature is the GetOrAdd method: it either returns an existing value, or invokes the valueFactory and returns a new value. Each time a new value is added, the boundedCapacity policy is enforced by evicting the least recently used value from the collection. The valueFactory is invoked lazily, so that multiple concurrent GetOrAdd calls for the same key receive the same value.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Runtime.InteropServices;

public class LRUCache<TKey, TValue> : IEnumerable<KeyValuePair<TKey, TValue>>
{
    private readonly Dictionary<TKey, LinkedListNode<Entry>> _dictionary;
    private readonly LinkedList<Entry> _linkedList;
    private readonly int _boundedCapacity;

    private readonly record struct Entry(TKey Key, Lazy<TValue> Lazy);

    public LRUCache(int boundedCapacity, IEqualityComparer<TKey> comparer = default)
    {
        if (boundedCapacity < 0)
            throw new ArgumentOutOfRangeException(nameof(boundedCapacity));
        _dictionary = new(boundedCapacity + 1, comparer);
        _linkedList = new();
        _boundedCapacity = boundedCapacity;
    }

    private object SyncRoot => _dictionary;
    public int Count { get { lock (SyncRoot) return _dictionary.Count; } }

    public TValue GetOrAdd(TKey key, Func<TKey, TValue> valueFactory)
    {
        ArgumentNullException.ThrowIfNull(valueFactory);
        Lazy<TValue> lazy;
        lock (SyncRoot)
        {
            ref LinkedListNode<Entry> refNode = ref CollectionsMarshal
                .GetValueRefOrAddDefault(_dictionary, key, out bool exists);
            if (exists)
            {
                lazy = refNode.Value.Lazy;
                if (!ReferenceEquals(refNode, _linkedList.Last))
                {
                    _linkedList.Remove(refNode);
                    _linkedList.AddLast(refNode);
                }
            }
            else
            {
                lazy = new(() => valueFactory(key));
                refNode = new(new Entry(key, lazy));
                _linkedList.AddLast(refNode);
                if (_dictionary.Count > _boundedCapacity)
                {
                    bool removed = _dictionary.Remove(_linkedList.First.Value.Key);
                    Debug.Assert(removed);
                    _linkedList.RemoveFirst();
                }
            }
            Debug.Assert(_dictionary.Count == _linkedList.Count);
        }
        return lazy.Value;
    }

    public void Clear()
    {
        lock (SyncRoot)
        {
            _dictionary.Clear();
            _linkedList.Clear();
        }
    }

    public IEnumerator<KeyValuePair<TKey, TValue>> GetEnumerator()
    {
        lock (SyncRoot)
        {
            return _linkedList
                .ToArray()
                .Select((Entry e) => KeyValuePair.Create(e.Key, e.Lazy.Value))
                .AsEnumerable()
                .GetEnumerator();
        }
    }

    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}

Usage example:

LRUCache<string, object> cache = new(30);
object value = cache.GetOrAdd("SomeKey", key => GetObject(key));

The advanced API CollectionsMarshal.GetValueRefOrAddDefault is used so that the key is hashed only once per GetOrAdd call.

If the valueFactory fails, the behavior of the Lazy<T> class is to cache the exception permanently. This behavior might not be suitable for a caching system, so you may want to substitute the Lazy<T> with the simple LazyWithRetry<T> implementation that I have posted here.
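
To illustrate the pitfall with a standalone snippet (separate from the class above):

// With the default LazyThreadSafetyMode, a factory failure is cached:
var lazy = new Lazy<int>(() => throw new InvalidOperationException("boom"));
try { _ = lazy.Value; } catch (InvalidOperationException) { }

// Every later access rethrows the same exception; the factory never runs again.
try { _ = lazy.Value; }
catch (InvalidOperationException ex) { Console.WriteLine(ex.Message); }    // boom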

In case you would like to use an asynchronous valueFactory, there are AsyncLazy<T> implementations in this question.

The LRUCache<TKey, TValue> class is thread-safe.

赢得她心 2024-07-23 12:28:10

If it's an ASP.NET app, you can use the Cache class [1], but you'll be competing for space with other cached content, which may or may not be what you want.

[1] http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx
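
For instance, a rough sketch with a sliding expiration (the key and expensiveObject are invented; note that the built-in scavenging is driven by memory pressure rather than strict LRU order):

using System;
using System.Web;
using System.Web.Caching;

HttpRuntime.Cache.Insert(
    "SomeKey",
    expensiveObject,                // whatever you want cached
    null,                           // no cache dependency
    Cache.NoAbsoluteExpiration,
    TimeSpan.FromMinutes(20));      // sliding expiration, refreshed on each access

object cached = HttpRuntime.Cache["SomeKey"];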

烏雲後面有陽光 2024-07-23 12:28:10

There is OrderedDictionary

using System.Collections.Specialized;

You can remove an element by key and (re)insert it at the end of the order. When you need memory, remove the first element in the order. A minimal sketch follows the link below.

This shows how, but it's a trifle slower:

https://leetcode.com/problems/lru-cache/solutions/1065496/c-two-implementationsordered-dictionary-linkedlist-and-their-comparison-with-explanation/
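
A minimal sketch of that approach (OrderedDictionary is non-generic, so keys and values are handled as object):

using System.Collections.Specialized;

public class OrderedDictionaryLru
{
    private readonly int capacity;
    private readonly OrderedDictionary map = new OrderedDictionary();

    public OrderedDictionaryLru(int capacity) { this.capacity = capacity; }

    public object Get(object key)
    {
        if (!map.Contains(key)) return null;
        object value = map[key];
        map.Remove(key);          // remove and re-append to mark as most recent
        map.Add(key, value);
        return value;
    }

    public void Add(object key, object value)
    {
        if (map.Contains(key)) map.Remove(key);
        else if (map.Count >= capacity) map.RemoveAt(0);   // first in order = least recently used
        map.Add(key, value);
    }
}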
