Asymptotic growth of functions

Published 2024-09-25


How do I determine whether given f(n) and g(n) are related by theta, omega, big-oh, little omega, or little oh?
I think one way of doing it is by plotting graphs of both functions f(n) and g(n). But even with the graphs plotted, how do we tell whether f(n) is in theta, omega, big-oh, little omega, or little oh of g(n)? I am not clear on that. Can someone give more details?

Also, I am not sure if this is the correct approach. Can anybody tell me if there is a simpler way of doing this?
For example: f(n) = sqrt(n) and g(n) = n^(sin n)


Comments (4)

看春风乍起 2024-10-02 00:08:46


You don't determine asymptotic growth by eyeballing it. There are formal definitions for each type of asymptotic growth relation that explain exactly and precisely what they mean. You determine whether two functions fit a given relation by determining mathematically if they satisfy the formal definition for the relation.

For example, one of several equivalent formal definitions for Big-O is as follows:

f = O(g) if and only if lim sup [ n -> inf ] ( f(n) / g(n) ) < inf

So, for example, if f(n) = n, and g(n) = n^2, you can observe that the limit as n -> infinity of n/n^2 = 0, and so f = O(g).

Or, if f(n) = n and g(n) = 2n, you can observe that as n-> inf, n / 2n -> 1/2 < inf, so again f = O(g).

However, if f(n) = n and g(n) = sqrt(n), you can observe that the limit as n -> infinity of n / sqrt(n) = infinity, so f != O(g).
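The three ratio checks above can also be approximated numerically. The sketch below just evaluates f(n)/g(n) at increasingly large n and watches whether the ratio stays bounded; it is a sanity check, not a proof (the function names and sample points are my own choices for illustration):

```python
import math

def ratio_trend(f, g, ns=(10**2, 10**4, 10**6, 10**8)):
    """Evaluate f(n)/g(n) at increasingly large n.
    A ratio that stays bounded suggests f = O(g); one that
    blows up suggests f != O(g). Heuristic only, not a proof."""
    return [f(n) / g(n) for n in ns]

# f(n) = n, g(n) = n^2: ratio tends to 0, so f = O(g)
print(ratio_trend(lambda n: n, lambda n: n * n))

# f(n) = n, g(n) = 2n: ratio tends to 1/2 < inf, so again f = O(g)
print(ratio_trend(lambda n: n, lambda n: 2 * n))

# f(n) = n, g(n) = sqrt(n): ratio grows without bound, so f != O(g)
print(ratio_trend(lambda n: n, lambda n: math.sqrt(n)))
```

Because this only samples finitely many points, it can be fooled by slowly diverging or oscillating ratios, which is exactly why the formal definition is the final arbiter.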

Your example f and g is tricky, because sine oscillates between -1 and 1 as n increases. But Wolfram Alpha tells me that the lim sup in question is infinity, so sqrt(n) != O(n^(sin n)).

Each other asymptotic growth relation has a similar formal definition that you can test to see if two functions satisfy it with respect to each other.

Edit:

It seems like you're looking for rules of thumb. Here's a quick and easy way to determine asymptotic order for relatively simple functions f and g. Consider the highest power of n in each function:

  • If the highest power is the same, then f = O(g), g = O(f), f = Omega(g), g = Omega(f), f = Theta(g), g = Theta(f)
  • If the highest power in f is smaller than the highest power in g, then f = O(g), f = o(g), g = Omega(f), g = omega(f)
  • If the highest power in f is larger than the highest power in g, then f = Omega(g), f = omega(g), g = O(f), g = o(f)
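For plain polynomials, the three bullets above can be mechanized by comparing degrees. A minimal sketch, assuming polynomials are represented as coefficient lists (a representation chosen here purely for illustration):

```python
def degree(coeffs):
    """Degree of a polynomial given as [c0, c1, c2, ...], where c_k
    is the coefficient of n^k; returns -1 for the zero polynomial."""
    for k in range(len(coeffs) - 1, -1, -1):
        if coeffs[k] != 0:
            return k
    return -1

def relations(f_coeffs, g_coeffs):
    """Which asymptotic relations hold between polynomials f and g,
    following the highest-power rules of thumb."""
    df, dg = degree(f_coeffs), degree(g_coeffs)
    return {
        "f = O(g)":     df <= dg,
        "f = o(g)":     df <  dg,
        "f = Omega(g)": df >= dg,
        "f = omega(g)": df >  dg,
        "f = Theta(g)": df == dg,
    }

# f(n) = 3n^2 + n and g(n) = n^2: same highest power, so Theta
print(relations([0, 1, 3], [0, 0, 1]))
```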

Of course, if the functions are not polynomials in n, or are more complex and it's not easy to determine what the highest powers are (e.g. your example using sine), you will still have to revert back to the formal definition.

Some things that are always true:

  • Theta is by definition the conjunction of O and Omega. f = O(g) ^ f = Omega(g) <=> f = Theta(g) [where ^ represents logical "and"]
  • The "little" relations are stronger than the "big" relations. This means that f = o(g) => f = O(g) and f = omega(g) => f = Omega(g), but the reverse directions are not true.
感情洁癖 2024-10-02 00:08:46


Here's how you compute Big-O given f(n) and g(n) using limits:

f(n) = O(g(n)) if and only if lim sup [ n -> inf ] ( f(n) / g(n) ) < inf

Plotting both f(x) and g(x) for large values of x on the same set of axes should also help you quickly identify the asymptotic behavior.
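As a quick way to eyeball that behavior without a plotting library, one can tabulate both functions at a few large sample points (the sample points below are arbitrary; the same triples could be fed into any plotting tool):

```python
import math

def sample(f, g, xs):
    """Sample both functions at the given points so they can be
    compared side by side (or handed to a plotting library)."""
    return [(x, f(x), g(x)) for x in xs]

# The question's example: f(x) = sqrt(x), g(x) = x^sin(x).
xs = [10, 100, 1000, 10000]
for x, fx, gx in sample(math.sqrt, lambda x: x ** math.sin(x), xs):
    print(f"x={x:>6}  sqrt(x)={fx:10.2f}  x^sin(x)={gx:10.2f}")
```

Note that for this particular pair, x^sin(x) oscillates between roughly 1/x and x, which is exactly why a plot or table alone can be misleading and the limit definition is still needed.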

旧街凉风 2024-10-02 00:08:46


First, you need to know the complexity of the functions you're calling. This may be documented, but in the likely case that it is not, you'll have to look at the source of those functions.

In the simple case you can count nested loops to get an idea of how long things take. If you have a loop from 1 to n inside a loop from 1 to t, and the only functions called inside take constant time (or you just do arithmetic operations inside, say), that's Theta(nt). If you happen to know that t <= n, it's also O(n^2).
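The nested-loop rule above can be checked by literally counting how often the constant-time body runs (a toy sketch):

```python
def count_inner_ops(n, t):
    """Count how many times the innermost constant-time work runs
    in a loop from 1..t containing a loop from 1..n."""
    ops = 0
    for _ in range(1, t + 1):      # outer loop: 1..t
        for _ in range(1, n + 1):  # inner loop: 1..n
            ops += 1               # constant-time body
    return ops

# The count is exactly n*t, matching the Theta(nt) bound.
print(count_inner_ops(100, 7))  # 700
```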

In more complicated cases you may need to use the Master Theorem or the like. In some cases it can be very hard (research-level) to determine the complexity!

友谊不毕业 2024-10-02 00:08:46


Plotting a graph of the functions is theoretically not enough. (Although plotting a graph and then reasoning based on it might be. And, practically, plotting for large enough values should be enough.)

The correct way is to construct a proof:

The function f(n) is O(g(n)) iff there exist positive K and n0, such that f(n) < K × g(n) for every n > n0.

Let's assume this is true for f(n) = sqrt(n) and g(n) = n^sin n and some K and n0.
Now consider a number n1 that

  1. is bigger than this n0
  2. is bigger than 1
  3. is bigger than K²
  4. for which the equation g(n1) = 1 holds

There are infinitely many numbers that satisfy those conditions: g(n1) = 1 when sin n1 = 0, that is n1 = a × π for any integer a. Limiting them to n1 > max(n0, 1, K²) still leaves infinitely many possibilities, so we pick one.

So we've got g(n1) = 1. What's the value of f(n1)? That's sqrt(n1), which is bigger than K (because n1 > K²).

Let's put it together: K < f(n1) < K × g(n1) = K, in other words K < K, which can't be true, that means the original assumption is wrong and so f(n) != O(g(n)).
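The contradiction above can be illustrated numerically: along n1 = a × π the exponent sin(n1) vanishes, so g(n1) ≈ 1 while f(n1) = sqrt(n1) grows past any fixed K. A sketch of that (not a substitute for the proof; the sample values of a are arbitrary):

```python
import math

def witness_ratios():
    """At n1 = a*pi, sin(n1) is numerically ~0, so g(n1) = n1**sin(n1)
    is ~1 while f(n1) = sqrt(n1) keeps growing; the unbounded ratio
    f(n1)/g(n1) witnesses sqrt(n) != O(n^sin(n))."""
    out = []
    for a in (10, 100, 1000, 10000, 100000):
        n1 = a * math.pi
        f, g = math.sqrt(n1), n1 ** math.sin(n1)
        out.append((n1, f / g))
    return out

for n1, r in witness_ratios():
    print(f"n1={n1:12.2f}  f(n1)/g(n1)={r:10.3f}")
```

Since the ratio along this subsequence exceeds any fixed K, no constants K and n0 can satisfy the definition.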
