I'm storing a large number of objects (each holding a unique combination of values in a byte array) in a HashMap: roughly 2.8 million of them. When I check for hash-code collisions (the hashes are 32-bit), I'm very surprised to find none at all, even though statistically I should have nearly a 100% chance of at least one collision (cf. http://preshing.com/20110504/hash-collision-probabilities/).
I am thus wondering whether my collision-detection approach is buggy or whether I'm just extremely lucky...
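As a sanity check on the math, here is a quick back-of-the-envelope estimate using the standard birthday approximation from the linked article (p ≈ 1 - e^(-n²/2d)), with my numbers plugged in:

    // Birthday-paradox estimate: probability of at least one collision
    // among n uniformly random hashes drawn from a space of size d.
    double n = 2_800_000d;         // number of keys in the map
    double d = Math.pow(2, 32);    // size of the 32-bit hash space
    double p = 1 - Math.exp(-(n * n) / (2 * d));
    System.out.println(p);         // prints 1.0 -- a collision is all but certain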
Here is how I try to detect collisions among the ~2.8 million keys stored in the map:
    HashMap<ShowdownFreqKeysVO, Double> values;
    // (... fill with ~2.8 million unique keys ...)

    HashSet<Integer> hashes = new HashSet<>();
    for (ShowdownFreqKeysVO key : values.keySet()) {
        if (hashes.contains(key.hashCode()))
            throw new RuntimeException("Duplicate hash for: " + key);
        hashes.add(key.hashCode());
    }
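(Side note: I believe the same check can be written more compactly by relying on the boolean returned by Set.add, which is false when the element is already present; the logic should be identical:)

    HashSet<Integer> hashes = new HashSet<>();
    for (ShowdownFreqKeysVO key : values.keySet()) {
        // Set.add returns false if this hash code was already seen
        if (!hashes.add(key.hashCode()))
            throw new RuntimeException("Duplicate hash for: " + key);
    }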
And here is how the object computes its hash value:
    import java.util.Arrays;

    public class ShowdownFreqKeysVO {

        // Values for the different parameters
        public byte[] values = new byte[12];

        @Override
        public int hashCode() {
            final int prime = 31;
            int result = 1;
            result = prime * result + Arrays.hashCode(values);
            return result;
        }

        @Override
        public boolean equals(Object obj) {
            if (this == obj)
                return true;
            if (obj == null)
                return false;
            if (getClass() != obj.getClass())
                return false;
            ShowdownFreqKeysVO other = (ShowdownFreqKeysVO) obj;
            return Arrays.equals(values, other.values);
        }
    }
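In case it's relevant: as far as I understand the Javadoc, Arrays.hashCode(byte[]) is specified to return the same value as List.hashCode over the boxed bytes, i.e. a base-31 polynomial over the array elements. A hand-rolled equivalent (for illustration only) would be:

    // Per the Javadoc contract, equivalent to Arrays.hashCode(values):
    // a base-31 polynomial over the elements, seeded with 1.
    int h = 1;
    for (byte b : values) {
        h = 31 * h + b;   // Byte.hashCode is just the byte value itself
    }
    // My hashCode() then returns 31 * 1 + h, i.e. h + 31.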
Any idea/hint on what I'm doing wrong would be greatly appreciated!
Thanks, Thomas