Big O when combining different routines



Let's say I have a routine that scans an entire list of n items 3 times, does a sort based on the size, and then bsearches that sorted list n times. The scans are O(n) time, the sort I will call O(n log(n)), and the n bsearches are O(n log(n)). If we add all 3 together, does that just give us the worst case of the 3 - the n log(n) value - or do the semantics allow the times to be added?

Pretty sure, now that I type this out, that the answer is n log(n), but I might as well confirm now that I have it typed out :)
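
For concreteness, here is a minimal Python sketch of the routine as described; the function name `process` and the three particular scans are illustrative stand-ins, not taken from the question. It just shows where the O(n) passes, the O(n log n) sort, and the n searches of O(log n) each come from.

```python
from bisect import bisect_left

def process(items, queries):
    # Three O(n) passes over the list (stand-ins for whatever the scans do).
    smallest = min(items)              # O(n)
    largest = max(items)               # O(n)
    total = sum(items)                 # O(n)

    ordered = sorted(items)            # O(n log n)

    # n binary searches, each O(log n): O(n log n) in total.
    positions = [bisect_left(ordered, q) for q in queries]

    return smallest, largest, total, positions
```

Adding the pieces up gives O(n) + O(n log n) + O(n log n), which is the sum the question is asking about.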


4 Answers

何以畏孤独 2024-12-05 17:00:08


The sum is indeed the worst of the three for Big-O.

The reason is that your function's time complexity is

(n) => 3n + n log n + n log n

which is

(n) => 3n + 2n log n

This function is bounded above by 3n log n, so it is in O(n log n).

You can choose any constant bigger than 2 here; I just happened to choose 3 because it gives a clean asymptotic upper bound.
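
If it helps to see the bound numerically, here is a small sketch (plain Python, base-2 logarithms assumed) that prints the ratio (3n + 2n log n) / (n log n). The ratio stays below 3 for every n in the list and drifts toward 2, which is exactly what the bound above says.

```python
import math

for n in [16, 256, 4096, 65536, 1 << 20]:
    f = 3 * n + 2 * n * math.log2(n)   # 3n + 2 n log n
    g = n * math.log2(n)               # n log n
    print(n, round(f / g, 3))          # below 3 here, approaching 2 as n grows
```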

我的影子我的梦 2024-12-05 17:00:08


You are correct. When n gets really big, the n log(n) term dominates the 3n term.

仅一夜美梦 2024-12-05 17:00:08


Yes, it will just be the worst case, since O-notation is only about asymptotic performance.

This should of course not be taken to mean that adding these extra steps will have no effect on your program's performance. One of the O(n) steps could easily consume a huge portion of the execution time over the range of n where your program actually operates.
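
To make that concrete, here is a hedged little experiment (Python's `timeit`; the `heavy_scan` helper is an invented stand-in for an expensive O(n) pass, and the absolute numbers will vary by machine). For a moderate n, a single linear pass that does real work per item can easily cost as much as, or more than, the O(n log n) sort.

```python
import math
import random
import timeit

n = 100_000
data = [random.random() for _ in range(n)]

def heavy_scan(xs):
    # One O(n) pass that does nontrivial work per item.
    return sum(math.sin(x) ** 2 + math.cos(x) ** 2 for x in xs)

def just_sort(xs):
    # The O(n log n) sort from the question.
    return sorted(xs)

print("one O(n) scan   :", timeit.timeit(lambda: heavy_scan(data), number=10))
print("O(n log n) sort :", timeit.timeit(lambda: just_sort(data), number=10))
```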

凹づ凸ル 2024-12-05 17:00:08


As Ray said, the answer is indeed O(n log(n)). The interesting part of this question is that it doesn't matter which way you look at it: whether "adding" means actual addition or taking the worst case of the terms. Let's prove that these two ways of looking at it produce the same result.

Let f(n) and g(n) be functions, and without loss of generality suppose f is O(g). (Informally, g is "worse" than f.) Then by definition, there exist constants M and k such that f(n) < M*g(n) whenever n > k. Looking at it the "worst case" way, we expect that f(n)+g(n) is O(g(n)). Now looking at it the "actual addition" way, and specializing to the case n > k, we have f(n) + g(n) < M*g(n) + g(n) = (M+1)*g(n), and so by definition f(n)+g(n) is O(g(n)), as desired.
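
The same final step, written out in LaTeX notation purely for readability (same constants M and k as above):

```latex
f(n) + g(n) \;<\; M\,g(n) + g(n) \;=\; (M+1)\,g(n) \quad \text{for all } n > k,
\qquad\text{so } f(n) + g(n) \in O\bigl(g(n)\bigr).
```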
