I have the following code that returns the number of nodes in a complete binary tree that is layer layers tall:
public static long nNodesUpToLayer(int layer) {
    if (layer < 0) throw new IllegalArgumentException(
            "The layer number must be non-negative: " + layer);
    // At layer 0, there must be 1 node: the root.
    if (layer == 0) return 1;
    // Else, there will be 1 + 2 * (the number of nodes in the previous layer) nodes.
    return 1 + (2 * nNodesUpToLayer(layer - 1));
}
The odd thing is, when I input 63 into the function (the minimum value that produces this), it gives me back -1. At 62, it gives back 9223372036854775807, so this seems to be caused by an overflow.
Shouldn't it give me back the minimum value of Java's long plus the amount it overflowed by? Regardless of the input I give it (past 62), it always returns -1 instead of the seemingly random number I'd expect from an overflow.
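For what it's worth, I can reproduce the -1 directly with a minimal sketch (my own test snippet, not part of the original code), by applying the recurrence one more time to the layer-62 result:

```java
public class OverflowDemo {
    public static void main(String[] args) {
        long atLayer62 = Long.MAX_VALUE;  // 9223372036854775807, the value returned for layer 62
        long atLayer63 = 1 + 2 * atLayer62;  // the recurrence applied once more; wraps in two's complement
        System.out.println(atLayer63);  // prints -1
    }
}
```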
I'm not entirely sure how to debug this, since it's recursive, and the value I'm interested in will only be evaluated once the function has reached the base case.
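The best I've come up with so far is a traced variant of the same function (a sketch; nNodesUpToLayerTraced is just my renamed copy) that prints each intermediate value as the recursion unwinds:

```java
public class TraceDemo {
    // Same recurrence as above, with a print at each level on the way back up.
    public static long nNodesUpToLayerTraced(int layer) {
        if (layer == 0) return 1;
        long below = nNodesUpToLayerTraced(layer - 1);
        long result = 1 + 2 * below;
        System.out.println("layer " + layer + " -> " + result);
        return result;
    }

    public static void main(String[] args) {
        nNodesUpToLayerTraced(3);  // prints the totals for layers 1, 2, and 3
    }
}
```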