I don't believe it's a good idea to have a Map with such a large number of values, but if you really want to go ahead with it, you can extend LinkedHashMap and use the BigInteger class to track a size beyond the int range.
import java.math.BigInteger;
import java.util.LinkedHashMap;

public class MyLinkedHashMap<K, V> extends LinkedHashMap<K, V> {

    private BigInteger mySize = BigInteger.ZERO;

    @Override
    public V put(K key, V value) {
        // HashMap still increments its own private int 'size' field here,
        // so this only fixes the count we report, not the internal one.
        V previous = super.put(key, value);
        if (previous == null) {
            mySize = mySize.add(BigInteger.ONE); // BigInteger is immutable; reassign the result
        }
        return previous;
    }

    @Override
    public V remove(Object key) {
        V removed = super.remove(key);
        if (removed != null) {
            mySize = mySize.subtract(BigInteger.ONE);
        }
        return removed;
    }

    @Override
    public int size() {
        // An int cannot represent the intended size, so refuse to answer.
        throw new UnsupportedOperationException("use getSize() instead");
    }

    public BigInteger getSize() {
        return mySize;
    }
}
Note that because you cannot change the return type of the size() method, you must create your own size method and variable to retrieve the size of the map.
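As a minimal usage sketch (assuming the class above compiles as written, and using a hypothetical Demo class), callers would read the count through getSize() instead of size():

import java.math.BigInteger;

public class Demo {
    public static void main(String[] args) {
        MyLinkedHashMap<Long, String> map = new MyLinkedHashMap<>();
        map.put(1L, "a");
        map.put(2L, "b");

        BigInteger count = map.getSize(); // prints 2
        System.out.println(count);

        // map.size(); // would throw UnsupportedOperationException
    }
}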
Additionally, you may actually need to reimplement HashMap itself, because its put() method still increments its internal int size field and will eventually push that value beyond the range of a Java int.
Lastly, just to be clear, this is NOT a good idea at all, because there are many pitfalls in repurposing an existing data structure this way: forgetting to override other methods that modify or use the size (a couple of which are sketched below), programmer errors that can corrupt the original data structure, or other instance variables and side effects that were never intended to handle such large sizes.
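For instance, even in the sketch above, HashMap's putAll() bypasses the overridden put(), and clear() bypasses remove(), so the BigInteger count silently drifts out of sync. A hedged illustration of just two of the extra overrides you would need to add inside MyLinkedHashMap (along with import java.util.Map;), and it is still not exhaustive, since compute(), merge(), and removal through iterators and views also change the size:

    @Override
    public void putAll(Map<? extends K, ? extends V> m) {
        // HashMap.putAll inserts entries directly, so route each one through our put() override.
        for (Map.Entry<? extends K, ? extends V> e : m.entrySet()) {
            put(e.getKey(), e.getValue());
        }
    }

    @Override
    public void clear() {
        super.clear();
        mySize = BigInteger.ZERO; // keep the external count in sync
    }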