
Suppose we are given the in-order and post-order traversals of a tree. For example: in-order: 30 40 45 50 65 70 80; post-order: 30 45 40 65 80 70 50.

I know how to construct a binary search tree from given in-order and post-order traversals, but my question is: what are the average-case and worst-case time complexities of BST construction when a post-order traversal is given?

Manu
    If you know how to implement something, count primitive operations. If it's recursive, write a recurrence. – n. m. could be an AI Aug 11 '15 at 16:56
    It depends on the algorithm you use for that. For example, here is one that works in O (n log n) in the worst case: http://stackoverflow.com/a/13168162/1488799 – Gassa Aug 11 '15 at 17:33
  • possible duplicate of [How to construct BST given post-order traversal](http://stackoverflow.com/questions/13167536/how-to-construct-bst-given-post-order-traversal) – Aivean Aug 11 '15 at 17:39
  • Is it possible to create a BST from a post-order traversal in O(n)? Though we would need to sort the post-order traversal first in order to know the in-order. – Manu Aug 12 '15 at 04:05

2 Answers

1

In both cases the naive BST construction algorithm takes O(n^2) time, because:

in the in-order case the algorithm keeps adding to the right

in the post-order case the algorithm keeps adding to the left side

So T(n) = 1 + 2 + 3 + ... + (n-1) = O(n^2)
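A quick sketch (my own illustration, with hypothetical names, not code from the answer): counting the comparisons that naive insertion performs when the keys arrive in sorted order shows the 1 + 2 + ... + (n-1) sum directly.

```python
# Hypothetical helper: naive BST insertion with a comparison counter,
# fed keys in sorted (in-order) sequence, which degenerates into a spine.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key, counter):
    # One comparison per node visited on the way down.
    if root is None:
        return Node(key)
    counter[0] += 1
    if key < root.key:
        root.left = insert(root.left, key, counter)
    else:
        root.right = insert(root.right, key, counter)
    return root

def comparisons_for_sorted(n):
    root, counter = None, [0]
    for key in range(n):  # keys arrive in increasing order -> right spine
        root = insert(root, key, counter)
    return counter[0]

print(comparisons_for_sorted(100))  # 4950 = 100*99/2, i.e. Theta(n^2)
```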

UPDATE_3: But in the post-order case we can simply add every next element as the root (the previous tree becomes its left son), so the complexity is O(n)
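A minimal sketch of that idea (my illustration, with hypothetical names; it assumes every next key is larger than all the previous ones): each new key becomes the new root and the whole previous tree its left child, one O(1) step per key.

```python
# Hypothetical sketch of the add-as-root trick; trees are
# (left, key, right) tuples and keys must arrive in increasing order.

def build_by_roots(keys):
    root = None
    for key in keys:
        # key is larger than everything inserted so far, so the whole
        # previous tree can become the left child of the new root: O(1)
        root = (root, key, None)
    return root

def in_order(tree):
    # Flatten the tuple tree to check the BST property survived.
    if tree is None:
        return []
    left, key, right = tree
    return in_order(left) + [key] + in_order(right)

print(in_order(build_by_roots([30, 40, 45, 50, 65, 70, 80])))
# [30, 40, 45, 50, 65, 70, 80]
```

The resulting tree is a pure left spine (deformed, but still a valid BST).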

UPDATE: Of course the average time over random permutations of the numbers is O(n log n), but in these two cases the time is O(n^2) (n is the number of keys)

UPDATE_2: For more details, see the comments below.

  • Given both traversals, it's possible to do much better than O(N^2). See comments to the original post. – Jim Mischel Aug 11 '15 at 17:41
  • I understand that, but I thought he meant the basic version of the BST construction algorithm, inserting the numbers in sorted order. Then the complexity is O(n^2) – jan_kowalski_314 Aug 11 '15 at 17:57
    You answered the wrong question. Your answer addresses the time complexity of creating a BST in the naive way from an inorder traversal. That's not the question he asked. He asked (although not very well) about the best and worst time complexity of using the algorithm that takes advantage of the order inherent in the post-order traversal. – Jim Mischel Aug 11 '15 at 18:14
  • In this case we can even remember a pointer to the biggest element in the tree and add every element in O(1). Of course this BST will also be deformed. – jan_kowalski_314 Aug 11 '15 at 18:20
  • There are O(n) algorithms for constructing a BST from both traversals. "You answered the wrong question" [2]. – Juan Lopes Aug 11 '15 at 20:53
  • @Juan Lopes, would you like to create an AVL tree or just a simple BST? For a simple BST, O(n) is enough: we add the next element as the root (the previous tree is the left son). For AVL (the algorithm was explained above) we get O(n log n) complexity. Do you agree with that? – jan_kowalski_314 Aug 11 '15 at 21:22
  • I agree that some algorithms may have an O(n log n) complexity. But the *problem complexity* (what the OP is asking about) is O(n). And nowhere does he/she mention an AVL tree. – Juan Lopes Aug 11 '15 at 23:20
  • How can it be possible to create a BST from a post-order traversal in O(n)? Creating a BST takes O(n log n): every time I need to place an element at its correct position I have to compare it with other elements in the tree, which takes O(log n), and there are n elements, so O(n log n). – Manu Aug 12 '15 at 04:03
  • @jan_kowalski_314 Can you please explain UPDATE_3: "in the post-order case we can simply add every next element as the root (the previous tree becomes its left son), so the complexity is O(n)". How is that possible? – Manu Aug 12 '15 at 04:07
  • We only need to preserve the BST property (each element in the left subtree is smaller and each element in the right subtree is bigger). So we add each element as the root (the previous tree becomes the left son). We know that every next element is bigger than all the previous elements, so the BST property is kept. Draw it to see. – jan_kowalski_314 Aug 12 '15 at 05:19
  • @Manu I added an algorithm to construct the tree from post order traversal in O(n). – Juan Lopes Aug 12 '15 at 11:29
  • To avoid deforming the tree we can take the center element as the root; the left subtree is then built from the smaller numbers and the right subtree from the bigger ones (all recursively). The complexity of this algorithm is T(n) = T(n/2) + T(n/2) + C, so T(n) = O(n), the same complexity. – jan_kowalski_314 Aug 12 '15 at 11:53
0

You can construct the tree using the post-order traversal in O(n) using the following rationale:

  • The last element is always the tree root
  • All the previous elements greater than the root are part of the right subtree; all the smaller elements are part of the left subtree.
  • You can apply the above rules recursively.
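As a worked illustration (my own, a direct but non-optimized version of the bullets above), the first split of the question's post-order sequence looks like this:

```python
# First split of the post-order sequence from the question.
post = [30, 45, 40, 65, 80, 70, 50]
root = post[-1]                             # the last element is the root
left = [x for x in post[:-1] if x < root]   # keys of the left subtree
right = [x for x in post[:-1] if x > root]  # keys of the right subtree
print(root, left, right)  # 50 [30, 45, 40] [65, 80, 70]
```

Applying the same rule recursively to each part yields the whole tree; filtering lists like this costs O(n) per level, which is why the implementation further down instead consumes the sequence once, from the right, with a running lower bound.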

The construction from in-order is even simpler: just pick the middle element as the root and recurse on both halves.

Here's an example implementation (in Python 3) that shows both constructions:

from collections import deque

def print_graphviz(tree):
    # Trees are (left, key, right) tuples; leaves are bare keys.
    if not isinstance(tree, tuple):
        return tree
    left = print_graphviz(tree[0])
    right = print_graphviz(tree[2])

    if left is not None: print(tree[1], '->', left)
    if right is not None: print(tree[1], '->', right)
    return tree[1]

def visit_post_order(in_queue, limit=None):
    # Consume the post-order sequence from the right. `limit` is the lower
    # bound for keys in the current subtree: once the next key drops below
    # it, this subtree is finished.
    if len(in_queue) == 0 or (limit is not None and in_queue[-1] < limit):
        return None
    root = in_queue.pop()
    right = visit_post_order(in_queue, root)   # right subtree: keys > root
    left = visit_post_order(in_queue, limit)   # left subtree: inherited bound

    if left is None and right is None:
        return root
    else:
        return (left, root, right)

def visit_in_order(in_order, begin, end):
    if begin == end: return None
    if begin + 1 == end: return in_order[begin]

    mid = (begin + end) // 2   # middle element becomes the root
    root = in_order[mid]
    left = visit_in_order(in_order, begin, mid)
    right = visit_in_order(in_order, mid + 1, end)

    if left is None and right is None:
        return root
    else:
        return (left, root, right)

def from_post_order(post_order):
    return visit_post_order(deque(post_order))

def from_in_order(in_order):
    return visit_in_order(in_order, 0, len(in_order))

print('digraph {')
print(print_graphviz(from_post_order([0, 2, 1, 4, 3, 6, 8, 7, 5])))
# print(print_graphviz(from_in_order([1, 2, 3, 4, 5, 6, 7, 8])))
print('}')

Run like this:

python3 test.py | dot -Tpng | display

And you'll have a nice tree output:

[image: rendered tree]

Juan Lopes
  • Am I missing something, or why do you use a deque for the post-order case if you are only popping from the end? A normal list would be simpler and faster if you don't need dual-ended or from-the-left operation. – NeilenMarais Mar 08 '17 at 15:38