Quicksort pivot points
For quicksort (in Java, if it matters), is there a relationship between the number of pivot points (or pivot indices) and the size of a given array? For example, if the array size is 10, are there always going to be, say, 5 pivot points?
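One way to pin down what "number of pivot points" means is to instrument a quicksort and count how many times a pivot is actually chosen. Below is a minimal sketch, not taken from any particular library: a single-pivot quicksort with Lomuto partitioning, the last element as the pivot, and a made-up class name `PivotCount`. For an array of size 10 the printed count lands somewhere between 5 and 9 depending on the input, which already suggests the count is not fixed by the size alone.

```java
import java.util.Random;

public class PivotCount {
    static int pivotCount = 0; // incremented once per partition call

    // Standard single-pivot quicksort; subarrays of size 0 or 1 need no pivot.
    static void quicksort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int p = partition(a, lo, hi);  // one pivot is chosen here
        quicksort(a, lo, p - 1);
        quicksort(a, p + 1, hi);
    }

    // Lomuto partition using the last element as the pivot.
    static int partition(int[] a, int lo, int hi) {
        pivotCount++;                  // count this pivot selection
        int pivot = a[hi];
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++;
            }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;
        return i;
    }

    public static void main(String[] args) {
        int[] a = new Random().ints(10, 0, 100).toArray(); // array of size 10
        quicksort(a, 0, a.length - 1);
        System.out.println("size = " + a.length + ", pivots chosen = " + pivotCount);
    }
}
```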
3 Answers
Yes, about n/2 is correct. However, I don't know why it would matter.
Not necessarily. It depends on the data and your algorithm. On average, with decently random data and a decent implementation, it should be on the order of log2(n) pivots.
You could have however many pivots you want (up to n, I suppose...).
The more pivots you have, the more efficient the next recursion will be, although if the number were too large (especially if it were non-constant) you would lose more time finding your pivots than you would gain.
I believe the typical choice is 3 potential pivots per iteration, but it's entirely dependent on the implementation.
But remember that in the worst case, you're going to end up with n iterations (quicksort's worst case is O(n^2)). That would naturally require n pivots, and each iteration would do very little work. Now, on the last iteration you can expect about n/3 pivots; on the iteration above that, about n/6; on the one above that, about n/12. Summing that series, n/3 + n/6 + n/12 + ... = 2n/3, so it looks like you can expect about 2n/3 total pivots (because you'd have about 2n/3 total iterations).
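To make the worst-case point concrete, here is a rough sketch along the same lines (assumptions: Lomuto partitioning with the last element as the pivot, and the made-up class name `WorstCaseDemo`). Feeding it an already-sorted array forces every partition to split off an empty side, so you get exactly n - 1 pivot selections and a recursion depth proportional to n, whereas random input typically comes in around 2n/3 pivots, roughly in line with the estimate above.

```java
import java.util.Random;

public class WorstCaseDemo {
    static int pivots; // number of pivot selections (partition calls)

    static void quicksort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        pivots++;
        int pivot = a[hi];            // last element as pivot: worst choice for sorted input
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;
        quicksort(a, lo, i - 1);
        quicksort(a, i + 1, hi);
    }

    static int countPivots(int[] a) {
        pivots = 0;
        quicksort(a, 0, a.length - 1);
        return pivots;
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) sorted[i] = i;            // already sorted: worst case here
        int[] random = new Random().ints(n, 0, 1_000_000).toArray();

        System.out.println("sorted input: " + countPivots(sorted) + " pivots"); // exactly n - 1
        System.out.println("random input: " + countPivots(random) + " pivots"); // typically around 2n/3
    }
}
```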