I have the following code:
import java.math.BigInteger;
import java.util.HashMap;
import java.util.Map;
public class DynamicFib
{
    private static Map<Integer, BigInteger> myMap = new HashMap<>();

    static {
        myMap.put(0, BigInteger.ZERO); // fibonacci(0)
        myMap.put(1, BigInteger.ONE);  // fibonacci(1)
    }

    public static BigInteger fibonacci(int x)
    {
        // System.out.println("x = [" + x + "]");
        return myMap.computeIfAbsent(x, n -> fibonacci(n - 2).add(fibonacci(n - 1)));
    }

    public static void main(String[] args)
    {
        System.out.println("l = " + fibonacci(25));
        System.out.println("myMap = " + myMap);
        System.out.println("myMap = " + myMap.keySet().size());
    }
}
Console output:
l = 75025
myMap = {0=0, 1=1, 2=1, 3=2, 4=3, 5=5, 6=8, 7=13, 8=21, 9=34, 10=55, 11=89, 12=144, 13=233, 14=377, 15=610, 16=987, 17=1597, 18=2584, 19=4181, 20=6765, 21=10946, 22=17711, 23=28657, 24=46368}
myMap = 31
The memo map prints just 25 entries, but size() returns 31. How? Is this a bug in the HashMap implementation?
When I change the HashMap to a ConcurrentHashMap, the program simply hangs as soon as I ask for the 9th or higher Fibonacci number.
Yet the HashMap version above works and returns the correct Fibonacci number even for fibonacci(1000)!
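For comparison, here is a sketch of the same memoization written with an explicit get/put instead of the recursive computeIfAbsent call (the class name DynamicFibExplicit is just for illustration, not part of my original code). I would expect this version to end up with exactly 26 entries after fibonacci(25):

import java.math.BigInteger;
import java.util.HashMap;
import java.util.Map;

public class DynamicFibExplicit
{
    private static Map<Integer, BigInteger> myMap = new HashMap<>();

    static {
        myMap.put(0, BigInteger.ZERO); // fibonacci(0)
        myMap.put(1, BigInteger.ONE);  // fibonacci(1)
    }

    // Check-then-put memoization: the map is only modified after the
    // recursive calls have returned, so the map is never changed while
    // another modification of it is still in progress.
    public static BigInteger fibonacci(int x)
    {
        BigInteger cached = myMap.get(x);
        if (cached != null) {
            return cached;
        }
        BigInteger result = fibonacci(x - 2).add(fibonacci(x - 1));
        myMap.put(x, result);
        return result;
    }

    public static void main(String[] args)
    {
        System.out.println("l = " + fibonacci(25));
        System.out.println("myMap size = " + myMap.keySet().size());
    }
}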