What is O(1/N) complexity compared to O(N)?
Is O(1/n) faster growing than O(1)? I am studying time complexity and saw O(1/n) being compared to O(n) in one of my exercise questions, which I had never seen before. I'm not sure how to work out the answer.

1 Answer
A complexity of O(1/n) would mean that the more data you process, the faster the algorithm runs... Hard to believe, for a start; and second, let's do the maths: the limit of 1/x as x goes to +∞ is zero... So the algorithm would solve the problem instantly? Hey, let's forget about quantum computing, we've found something better! :)
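To spell that limit argument out (just a sketch using the usual big-O definition; the constants c, c' and the threshold n0 are mine, not the answerer's):

```latex
% If the running time T(n) were in O(1/n), it would be squeezed to zero:
%   T(n) <= c/n for all n >= n0  ==>  T(n) -> 0 as n -> infinity.
% But any algorithm that performs at least one operation takes some minimal
% positive time c', so T(n) >= c' > 0, i.e. T(n) is Omega(1), which
% contradicts T(n) -> 0.
\[
T(n) \le \frac{c}{n} \;\;(n \ge n_0)
\;\Longrightarrow\; \lim_{n \to \infty} T(n) = 0,
\qquad\text{yet}\qquad
T(n) \ge c' > 0
\;\Longrightarrow\; T(n) \in \Omega(1).
\]
```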
Joking aside: such a complexity doesn't exist, because 1/n is a decreasing monotonic function, whereas complexities are (at least) non-decreasing functions of the input size. At best, an algorithm is O(1), meaning constant time whatever the amount of data is. In fact, that is not even such a common complexity for a whole algorithm, even though it is quite frequent for certain operations / manipulations.

For example, retrieving the head of a standard linked list is indeed O(1), even if the list is empty or contains all the possible data of the Universe (if it were storable...), because the list's head is exactly what is stored to access it. The same goes for all operations that only exchange pointers/handles, for all direct accesses (like the [] operator of most arrays), and so on. But most algorithms don't have such a nice complexity.
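As a minimal sketch of that O(1) behaviour (the Node/LinkedList classes below are mine, purely illustrative):

```python
class Node:
    """Minimal singly linked list node (illustrative only)."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node


class LinkedList:
    def __init__(self):
        self.head = None          # the list stores a reference to its head

    def push_front(self, value):
        # O(1): a couple of pointer assignments, regardless of the length
        self.head = Node(value, self.head)

    def front(self):
        # O(1): the head is exactly what the list stores to access itself
        return self.head.value if self.head else None


values = LinkedList()
for v in range(1_000_000):
    values.push_front(v)

print(values.front())   # O(1), no matter how many elements were pushed

array = list(range(1_000_000))
print(array[123_456])   # direct indexing ([] operator) is also O(1)
```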
Even a simple (deep) copy is already O(n)... Most searches in a data store are O(log2(n)). Most sorts are O(n.log2(n)). Most cross-comparisons are O(n²). All these functions are (strictly) increasing, and they all tend to infinity as n itself tends to infinity.
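And as a rough illustration of those typical growth rates (a sketch; the helper functions are mine, not part of the original answer):

```python
import bisect


def deep_copy(items):
    # O(n): every element has to be visited once
    return [x for x in items]


def contains_sorted(sorted_items, target):
    # O(log2(n)): binary search halves the remaining range at each step
    i = bisect.bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target


def sort_items(items):
    # O(n.log2(n)): comparison sort (Timsort here)
    return sorted(items)


def count_duplicate_pairs(items):
    # O(n²): every element is compared with every other one
    return sum(
        1
        for i in range(len(items))
        for j in range(i + 1, len(items))
        if items[i] == items[j]
    )


data = [5, 3, 8, 3, 1]
print(deep_copy(data))                      # grows linearly with n
print(contains_sorted(sorted(data), 8))     # grows logarithmically with n
print(sort_items(data))                     # grows like n*log2(n)
print(count_duplicate_pairs(data))          # grows quadratically with n
```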