Unit testing: ensuring good coverage while avoiding unnecessary tests

I've written a class, CachedStreamingEnumerable<T>, which is an enumerable wrapper that caches the results of an underlying enumerable, only fetching the next element when an enumeration reaches the end of the cached results. It can be multi-threaded (fetching the next item on another thread) or single-threaded (fetching the next item on the current thread).
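
For context, a rough usage sketch (the SlowSource method here is hypothetical, standing in for any enumerable where producing each item is expensive; with this class, each complete pass over the cache fetches at most one more item from the source):

static IEnumerable<int> SlowSource()
{
    for (int i = 0; i < 5; i++)
    {
        Thread.Sleep(500);   // simulate a costly fetch per item
        yield return i;
    }
}

static void Example()
{
    // Single-threaded: the next item is fetched on the current thread
    // whenever an enumeration runs off the end of the cache.
    var cached = new CachedStreamingEnumerable<int>(SlowSource(), false);

    foreach (var item in cached) { }   // cache is empty; item 0 is fetched afterwards
    foreach (var item in cached) { }   // sees [0]; item 1 is fetched afterwards
}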

I'm reading up on unit testing and would like to get my head around appropriate tests. I'm using NUnit. My main issue is that I've already written my class and am using it. It works for what I'm using it for (one thing, currently). So I'm writing my tests by just trying to think of things that could go wrong, and since I've already tested it informally, I'm probably unconsciously writing tests for things I know I've already checked. How can I get the right balance between too many (or too fine-grained) tests and too few?

  1. Should I only be testing public methods/constructors, or should I test every method?
  2. Should I test the CachedStreamingEnumerable.CachedStreamingEnumerator class separately?
  3. Currently I'm only testing the class when it's set to be single-threaded. How do I go about testing it when multi-threaded, given that I might need to wait a period of time before an item is retrieved and added to the cache? (A rough polling-based sketch follows this list.)
  4. What tests am I missing to ensure good coverage? Are any of the ones I've already got unnecessary?
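
For concreteness on point 3, one timing-tolerant approach would be to poll for the background fetch rather than sleep for a fixed period. A rough sketch only (the test name and timeout are arbitrary):

[Test]
public void MultiThreadedFetchEventuallyAddsToCache()
{
    var cached = new CachedStreamingEnumerable<int>(Enumerable.Range(0, 3), true);

    // Running off the end of the (still empty) cache queues a fetch of the
    // first item on the thread pool.
    foreach (var x in cached) { }

    // Poll for the item rather than sleeping for a fixed period; give up after ~1 second.
    bool itemAppeared = false;
    for (int attempt = 0; attempt < 100 && !itemAppeared; attempt++)
    {
        itemAppeared = cached.Any();
        if (!itemAppeared)
            Thread.Sleep(10);
    }

    Assert.IsTrue(itemAppeared, "Expected the background fetch to cache an item within ~1 second.");
}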

Code for the class, and test class below.

CachedStreamingEnumerable

/// <summary>
/// An enumerable that wraps another enumerable where getting the next item is a costly operation.
/// It keeps a cache of items, getting the next item from the underlying enumerable only if we iterate to the end of the cache.
/// </summary>
/// <typeparam name="T">The type that we're enumerating over.</typeparam>
public class CachedStreamingEnumerable<T> : IEnumerable<T>
{
    /// <summary>
    /// An enumerator that wraps another enumerator,
    /// keeping track of whether we got to the end before disposing.
    /// </summary>
    public class CachedStreamingEnumerator : IEnumerator<T>
    {
        public class DisposedEventArgs : EventArgs
        {
            public bool CompletedEnumeration;

            public DisposedEventArgs(bool completedEnumeration)
            {
                CompletedEnumeration = completedEnumeration;
            }
        }

        private IEnumerator<T> _UnderlyingEnumerator;

        private bool _FinishedEnumerating = false;

        // An event for when this enumerator is disposed.
        public event EventHandler<DisposedEventArgs> Disposed;

        public CachedStreamingEnumerator(IEnumerator<T> UnderlyingEnumerator)
        {
            _UnderlyingEnumerator = UnderlyingEnumerator;
        }

        public T Current
        {
            get { return _UnderlyingEnumerator.Current; }
        }

        public void Dispose()
        {
            _UnderlyingEnumerator.Dispose();

            if (Disposed != null)
                Disposed(this, new DisposedEventArgs(_FinishedEnumerating));
        }

        object System.Collections.IEnumerator.Current
        {
            get { return _UnderlyingEnumerator.Current; }
        }

        public bool MoveNext()
        {
            bool MoveNextResult = _UnderlyingEnumerator.MoveNext();

            if (!MoveNextResult)
            {
                _FinishedEnumerating = true;
            }

            return MoveNextResult;
        }

        public void Reset()
        {
            _FinishedEnumerating = false;
            _UnderlyingEnumerator.Reset();
        }
    }

    private bool _MultiThreaded = false;

    // The slow enumerator.
    private IEnumerator<T> _SourceEnumerator;

    // Whether we're currently already getting the next item.
    private bool _GettingNextItem = false;

    // Whether we've got to the end of the source enumerator.
    private bool _EndOfSourceEnumerator = false;

    // The list of values we've got so far.
    private List<T> _CachedValues = new List<T>();

    // An object to lock against, to protect the cached value list.
    private object _CachedValuesLock = new object();

    // A reset event to indicate whether the cached list is safe, or whether we're currently enumerating over it.
    private ManualResetEvent _CachedValuesSafe = new ManualResetEvent(true);
    private int _EnumerationCount = 0;

    /// <summary>
    /// Creates a new instance of CachedStreamingEnumerable.
    /// </summary>
    /// <param name="Source">The enumerable to wrap.</param>
    /// <param name="MultiThreaded">True to load items in another thread, otherwise false.</param>
    public CachedStreamingEnumerable(IEnumerable<T> Source, bool MultiThreaded)
    {
        this._MultiThreaded = MultiThreaded;

        if (Source == null)
        {
            throw new ArgumentNullException("Source");
        }

        _SourceEnumerator = Source.GetEnumerator();
    }

    /// <summary>
    /// Handler for when the enumerator is disposed.
    /// </summary>
    /// <param name="sender"></param>
    /// <param name="e"></param>
    private void Enum_Disposed(object sender, CachedStreamingEnumerator.DisposedEventArgs e)
    {
        // The cached list is now safe (because we've finished enumerating).
        lock (_CachedValuesLock)
        {
            // Reduce our count of (possible) nested enumerations
            _EnumerationCount--;
            // Pulse the monitor since this could be the last enumeration
            Monitor.Pulse(_CachedValuesLock);
        }

        // If we've got to the end of the enumeration,
        // and our underlying enumeration has more elements,
        // and we're not getting the next item already
        if (e.CompletedEnumeration && !_EndOfSourceEnumerator && !_GettingNextItem)
        {
            _GettingNextItem = true;

            if (_MultiThreaded)
            {
                ThreadPool.QueueUserWorkItem((Arg) =>
                {
                    AddNextItem();
                });
            }
            else
                AddNextItem();
        }
    }

    /// <summary>
    /// Adds the next item from the source enumerator to our list of cached values.
    /// </summary>
    private void AddNextItem()
    {
        if (_SourceEnumerator.MoveNext())
        {
            lock (_CachedValuesLock)
            {
                while (_EnumerationCount != 0)
                {
                    Monitor.Wait(_CachedValuesLock);
                }

                _CachedValues.Add(_SourceEnumerator.Current);
            }
        }
        else
        {
            _EndOfSourceEnumerator = true;
        }

        _GettingNextItem = false;
    }

    System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }

    public IEnumerator<T> GetEnumerator()
    {
        lock (_CachedValuesLock)
        {
            var Enum = new CachedStreamingEnumerator(_CachedValues.GetEnumerator());

            Enum.Disposed += new EventHandler<CachedStreamingEnumerator.DisposedEventArgs>(Enum_Disposed);

            _EnumerationCount++;

            return Enum;
        }
    }
}

CachedStreamingEnumerableTests

[TestFixture]
public class CachedStreamingEnumerableTests
{
    public bool EnumerationsAreSame<T>(IEnumerable<T> first, IEnumerable<T> second)
    {
        if (first.Count() != second.Count())
            return false;

        return !first.Zip(second, (f, s) => !s.Equals(f)).Any(diff => diff);
    }

    [Test]
    public void InstantiatingWithNullParameterThrowsException()
    {
        Assert.Throws<ArgumentNullException>(() => new CachedStreamingEnumerable<int>(null, false));
    }

    [Test]
    public void SameSequenceAsUnderlyingEnumerationOnceCached()
    {
        var SourceEnumerable = Enumerable.Range(0, 10);
        var CachedEnumerable = new CachedStreamingEnumerable<int>(SourceEnumerable, false);

        // Enumerate the cached enumerable completely once for each item, so we ensure we cache all items
        foreach (var x in SourceEnumerable)
        {
            foreach (var i in CachedEnumerable)
            {

            }
        }

        Assert.IsTrue(EnumerationsAreSame(Enumerable.Range(0, 10), CachedEnumerable));
    }

    [Test]
    public void CanNestEnumerations()
    {
        var SourceEnumerable = Enumerable.Range(0, 10).Select(i => (decimal)i);
        var CachedEnumerable = new CachedStreamingEnumerable<decimal>(SourceEnumerable, false);

        Assert.DoesNotThrow(() =>
            {
                foreach (var d in CachedEnumerable)
                {
                    foreach (var d2 in CachedEnumerable)
                    {

                    }
                }
            });
    }
}

Comments (2)

另类 2024-12-04 12:47:17

Ad 1)
If you need to test private methods, that should tell you something; probably that your class has too many responsibilities. Quite often, private methods are separate classes waiting to be born :-)

Ad 2)
Yes

Ad 3)
Following the same argument as in 1, the threading functionality should probably not live inside the class if that can be avoided. I recall reading about this in "Clean Code" by Robert Martin: he makes the point that threading is a separate concern that should be kept apart from the other pieces of business logic.
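
To illustrate with a sketch (not part of the posted code; the _Scheduler field and constructor signature are made up for the example): if the scheduling strategy were injected, the class would no longer branch on _MultiThreaded, and tests could pass a synchronous scheduler to exercise the "background fetch" path deterministically.

// Hypothetical refactoring: inject how AddNextItem gets scheduled.
private Action<Action> _Scheduler;

public CachedStreamingEnumerable(IEnumerable<T> Source, Action<Action> Scheduler)
{
    if (Source == null)
        throw new ArgumentNullException("Source");

    _SourceEnumerator = Source.GetEnumerator();

    _Scheduler = Scheduler;
    if (_Scheduler == null)
        _Scheduler = work => work();   // default: fetch on the current thread
}

// Enum_Disposed then simply calls:
//     _Scheduler(AddNextItem);
//
// Production wiring (fetch in the background):
//     new CachedStreamingEnumerable<int>(source, work => ThreadPool.QueueUserWorkItem(_ => work()));
// Test wiring (deterministic, no sleeps):
//     new CachedStreamingEnumerable<int>(source, work => work());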

Ad 4)
The private methods are the hardest to cover, so I again point back to my answer to 1. If your private methods were public methods in separate classes, they would be much easier to cover. Also, the tests for your main class would be easier to understand.
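
As a sketch of what that could look like here (the SourceReader name and its API are made up, not from the posted code), the "read the next item from the slow source" responsibility could live in a small class of its own with a public method, which is then trivial to cover directly:

public class SourceReader<T>
{
    private readonly IEnumerator<T> _Source;

    public bool EndOfSource { get; private set; }

    public SourceReader(IEnumerable<T> source)
    {
        if (source == null)
            throw new ArgumentNullException("source");

        _Source = source.GetEnumerator();
    }

    // Returns true and sets item if another item was available.
    public bool TryReadNext(out T item)
    {
        if (!EndOfSource && _Source.MoveNext())
        {
            item = _Source.Current;
            return true;
        }

        EndOfSource = true;
        item = default(T);
        return false;
    }
}

[Test]
public void TryReadNextReportsEndOfSource()
{
    var reader = new SourceReader<int>(Enumerable.Range(0, 1));

    int item;
    Assert.IsTrue(reader.TryReadNext(out item));
    Assert.AreEqual(0, item);
    Assert.IsFalse(reader.TryReadNext(out item));
    Assert.IsTrue(reader.EndOfSource);
}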

Regards,
Morten

绳情 2024-12-04 12:47:17

Rather than riddle you with details, I'd simply advise you to be practical and to follow the "Law of the Critical Few" when creating your tests. You do not need to test every accessor or every small fragment of industry-standard code.

Think of what kinds of things would hurt your class the worst and guard against them. Check boundary conditions. Draw on your memory of what has broken similar code for you in the past. Try test data values that may be unexpected.
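
For example, one boundary case the posted fixture doesn't cover is an empty source. A sketch, using NUnit as in the existing tests:

[Test]
public void EmptySourceYieldsNoItems()
{
    var cached = new CachedStreamingEnumerable<int>(Enumerable.Empty<int>(), false);

    // Enumerate more than once: reaching the end of an empty source should not
    // throw, and the wrapper should never produce items that aren't there.
    Assert.AreEqual(0, cached.Count());
    Assert.AreEqual(0, cached.Count());
}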

You are probably not doing this as an academic exercise. You probably want to ensure that your class is solid and that it will stay that way when you go back later to refactor it or when you want to ensure that it is not the cause of misbehavior in one of its client classes.

Your every test should be there for a reason, not just so you can be cool at the next TDD club meeting!
