Time complexity of stack-based tree traversal
What is the time complexity of the binary tree traversal implementation below?
void Tree::nonRecInOrder()
{
    // nonrecursive inorder traversal using a stack
    Stack< TreeNode* > s;               // declare and initialize stack
    TreeNode* currentNode = root;
    while( true )
    {
        while( currentNode )
        {
            // move down leftChild fields
            s.add( currentNode );
            currentNode = currentNode->leftChild;
        }
        if( !s.isEmpty() )              // stack is not empty
        {
            currentNode = *s.del( currentNode );   // delete from stack
            cout << currentNode->data;
            currentNode = currentNode->rightChild;
        }
        else
        {
            break;
        }
    }
}
Could you also explain how to calculate the complexity?
1 Answer
One way to characterize the big-O complexity of a difficult function is to think about some resource it accesses and then to bound the number of times that resource is accessed. In this case, when you're doing a tree traversal, you might count the number of times each node is pushed onto or popped from the stack. The reason this gives a good bound is that all the hard work of this function - the inner loop descending through a chain of left children and the outer loop processing the topmost node on the stack - can be bounded by the number of pushes and pops. The outermost loop terminates when the stack is empty, so it can't run more times than something is pushed onto the stack, and the innermost loop does work proportional to the total number of pushes performed over the course of the loop.
So let's see how we can bound these quantities. The first question is how many times each node can be added to the stack. Well, a node is only added to the stack if it, or one of its ancestors along a left-only path, is the current node when the inner loop starts executing. How many times can this happen? My claim is that it happens at most once. The proof is an induction on the depth of the node in the tree. We use the observation that a node is only chosen again as the current node if it is the direct right child of a node popped from the stack. As the base case of the induction, if the node is the root (it's at depth zero), it can't be chosen a second time because it has no parent. For the inductive step, if no node at depth d can be chosen as the current node twice, then no node at depth d + 1 can be chosen twice either, because a node at depth d + 1 is only chosen again when its parent is chosen again, and by the inductive hypothesis we know that never happens. Consequently, no node is ever chosen as the current node twice. We follow this up with a simple observation: no node that is a left child can ever be the current node at the start of the outer loop, since the current node there is either the root (not a left child) or the right child of some node. This claim, combined with the fact that no node is chosen twice, means that each node is added to the stack at most once, which happens when the topmost ancestor on its left-only chain becomes the current node.
This also bounds the number of pops, since the number of pops can't exceed the number of pushes. Since each node is pushed at most once, it is also popped at most once. To finish things off, we know that the total work of the function is bounded by the number of pushes and pops performed, and so the complexity is O(n), where n is the number of nodes.
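To make the bound concrete, here is a minimal, self-contained sketch of the same control flow, rewritten against std::stack and a bare TreeNode struct (the names here are my own for illustration, not the questioner's Tree class), with counters added. For a tree of n nodes it always reports exactly n pushes and n pops:

    #include <iostream>
    #include <stack>

    struct TreeNode {
        int data;
        TreeNode* leftChild;
        TreeNode* rightChild;
    };

    // Same control flow as nonRecInOrder() above, instrumented with
    // counters so the push/pop bound can be checked directly.
    void inOrderCounted( TreeNode* root )
    {
        std::stack< TreeNode* > s;
        TreeNode* currentNode = root;
        long pushes = 0, pops = 0;
        while( true )
        {
            while( currentNode )            // move down leftChild fields
            {
                s.push( currentNode );
                ++pushes;
                currentNode = currentNode->leftChild;
            }
            if( s.empty() )
                break;
            currentNode = s.top();          // delete from stack
            s.pop();
            ++pops;
            std::cout << currentNode->data << ' ';
            currentNode = currentNode->rightChild;
        }
        std::cout << "\npushes = " << pushes << ", pops = " << pops << '\n';
    }

    int main()
    {
        // Small test tree:   2
        //                   / \
        //                  1   3
        TreeNode n1{ 1, nullptr, nullptr };
        TreeNode n3{ 3, nullptr, nullptr };
        TreeNode n2{ 2, &n1, &n3 };
        inOrderCounted( &n2 );   // prints "1 2 3" then "pushes = 3, pops = 3"
    }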
Whew! That was not fun to analyze. I like the recursive version a lot more. :-)
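For comparison, a recursive in-order traversal (sketched here against the same hypothetical TreeNode struct as above) does the bookkeeping implicitly: the call stack plays the role of the explicit stack, each node triggers exactly one call, and the complexity is again O(n), with no induction needed:

    // Recursive in-order traversal: one call per node, so O(n) overall,
    // with the call stack standing in for the explicit Stack.
    void inOrder( TreeNode* node )
    {
        if( node == nullptr ) return;
        inOrder( node->leftChild );         // traverse left subtree
        std::cout << node->data << ' ';     // visit the node itself
        inOrder( node->rightChild );        // traverse right subtree
    }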