
Many recursive solutions to problems of size N follow the pattern:

Step 1: Solve the problem for the smallest input (say, n = 1)

Step 2: Given the solution to the same problem of size n = k-1 (k <= N), solve it for n = k.

We can see the inductive nature of this, which is why we typically use induction to prove recursive algorithms correct. For some problems, like recursive Fibonacci, the pattern emerges in an obvious way. My question is whether, say, binary tree traversal can also be seen as following this pattern.
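
To illustrate what I mean, a recursive Fibonacci makes the two steps explicit (a minimal sketch; the n <= 1 base cases are the usual convention):

def fib(n):
  # Step 1: the smallest inputs are solved directly (base cases)
  if n <= 1:
    return n
  # Step 2: assume fib is already correct for smaller inputs,
  # and build the solution for n from those smaller solutions
  return fib(n - 1) + fib(n - 2)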

Take Depth First Search for instance:

def DFS(root):
  if root == None:
    return
  print(root.value)
  DFS(root.left)
  DFS(root.right)

I understand the code and the call stack, but I'm struggling to articulate exactly what steps 1 and 2 are for DFS (in keeping with the structure of recursive solutions outlined above). It can't be that the only base case is when the current node is None, because traversing a single node should also be a base case.

If I say that line 4, the one containing the print statement, is a base case, that wouldn't make much sense, because it runs for every sub-tree.

  • That looks more like a preorder traversal than a DFS, if I'm not mistaken...? Anyway, how is a single node a base case? Only `root == None` is a base case in the algorithm shown. With a single (leaf) node here, you still recurse and visit its children, even though they're `None`. Trying to apply induction to this seems like overthinking it a bit. I'm not sure what benefit it offers, exactly. – ggorlen Sep 05 '21 at 20:54
  • @ggorlen True, it's just pre-order traversal without search. The benefit, to me, would be to connect two concepts and gain a deeper understanding. Also, sometimes thinking "what's the simplest version of this problem?" and "how would I go from a solution to a smaller problem to the problem that's one step larger?" helps in designing the algorithm... I don't feel like I can apply those steps to tree traversal problems, so I tend to struggle with them. – Vahram Poghosyan Sep 06 '21 at 00:23
  • Right, but in this case there's not really any problem to be solved per se -- all that's happening is a print, so no results or data are moving through the tree. You just visit nodes if they exist, print, and that's about all there is to it. The only "trick"/"insight" in a basic tree traversal like this is where the print (or generic visitation code) goes to produce different orderings (see the sketch after these comments). – ggorlen Sep 06 '21 at 00:38
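
For reference, here is a small sketch of that last point (function names are just illustrative): moving the print/visit line relative to the recursive calls produces the different orderings.

def preorder(node):
  if node is None:
    return
  print(node.value)    # visit before the children -> pre-order
  preorder(node.left)
  preorder(node.right)

def inorder(node):
  if node is None:
    return
  inorder(node.left)
  print(node.value)    # visit between the children -> in-order
  inorder(node.right)

def postorder(node):
  if node is None:
    return
  postorder(node.left)
  postorder(node.right)
  print(node.value)    # visit after the children -> post-order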

1 Answer


It follows the same pattern.

The "problem" to solve is printing the tree in the right order. Your code implements pre-order.

Step 1: Solve the problem for the smallest input (say, n = 1)

The smallest problem here is printing an empty tree (root is None). In that case the solution is to print nothing (just return).

Step 2: Given the solution to the same problem of size n = k-1 (k <= N), solve it for n = k.

We can take k to be the number of nodes in the current tree and n to be the number of nodes in one of its direct subtrees. Printing the left or right subtree is then a smaller problem, where n <= k-1 (note the <=: each subtree excludes at least the root, and possibly the nodes of the other subtree).

Printing the tree (in pre-order) consists of printing the root node and then solving the same problem for the left and right subtrees.
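
To make the mapping explicit, here is your function again with the two steps marked as comments (nothing is changed except the comments):

def DFS(root):
  # Step 1 (base case): the empty tree -- print nothing
  if root == None:
    return
  # Step 2 (inductive step): print the root, then rely on DFS
  # already solving the strictly smaller problems of printing
  # the left and right subtrees (each has at most k-1 nodes)
  print(root.value)
  DFS(root.left)
  DFS(root.right)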


One difference is that the output is a "side effect" of this function -- it does not return the result, it prints it. That means the caller making the recursive call does not get back a result to work with. A purer function would return a string representation of the tree, and the caller would concatenate that into the string it is building and return that to its own caller.
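
A sketch of that purer version could look like this (the function name and the space separator are just illustrative choices):

def to_preorder_string(root):
  # Base case: the empty tree is represented by the empty string
  if root is None:
    return ""
  # Inductive step: combine the root's value with the string
  # representations of the (smaller) left and right subtrees
  return str(root.value) + " " + to_preorder_string(root.left) + to_preorder_string(root.right)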

trincot