I'm creating multiple Chronicle Maps just to avoid contention between threads. I have 10 threads that each need something from the cache. With a single cache I observed continuously increasing putAll() times, up to 2.6 seconds, with each putAll() inserting 2016 double[3][2] arrays. So I split the data across 10 caches, avoiding contention by ensuring that keys written by one thread never collide with keys in another thread's cache. But with 10 maps the GC pauses came out as long as 45 seconds, compared to ~50 ms with a single Chronicle Map.
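For reference, each batch write looks roughly like this (a minimal sketch; buildKey() and computeValue() are illustrative placeholders, only the 2016-entry batch of double[3][2] values matches my real workload):

Map<CharSequence, double[][]> batch = new HashMap<>(2016);
for (int j = 0; j < 2016; j++) {
    batch.put(buildKey(j), computeValue(j)); // computeValue(j) returns a double[3][2]
}
cache.putAll(batch); // this call's latency grew to ~2.6 s with a single map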
private final ChronicleMap<CharSequence, double[][]>[] cache = new ChronicleMap[totalCaches];

for (int i = 0; i < totalCaches; i++) {
    try {
        cache[i] =
                ChronicleMap.of(CharSequence.class, double[][].class)
                        .entriesPerSegment(1000000)
                        .averageKeySize(44.0)
                        .averageValueSize(119.0)
                        .entries(40320000)
                        .maxBloatFactor(10.0)
                        .name(CACHE_NAME.concat(String.valueOf(i)))
                        .putReturnsNull(true)
                        .createOrRecoverPersistedTo(
                                new File(
                                        "/var/opt/cache/"
                                                .concat(CACHE_NAME)
                                                .concat(String.valueOf(i))
                                                .concat(".dat")));
    } catch (final IOException e) {
        LOGGER.error("GA cache init error", e);
    }
}
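Each worker thread then writes only to its own map, along these lines (a sketch; the index assignment is illustrative, the point is that no two threads ever share a ChronicleMap instance or a key):

// Worker i writes exclusively to cache[i], so threads never contend
// on the same map.
final int threadIndex = workerId % totalCaches; // workerId is illustrative
cache[threadIndex].putAll(batch);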
Another problem: I tried specifying constantValueSizeBySample() with a double[][] sample, and it threw an exception stating that the value size should be 119, which doesn't make sense to me.
double[][] sample = new double[][] {
    {Math.random(), Math.random()},
    {Math.random(), Math.random()},
    {Math.random(), Math.random()}
};
for (int i = 0; i < totalCaches; i++) {
    try {
        cache[i] =
                ChronicleMap.of(CharSequence.class, double[][].class)
                        .entriesPerSegment(1000000)
                        .averageKeySize(44.0)
                        .constantValueSizeBySample(sample)
                        .entries(40320000)
                        .maxBloatFactor(10.0)
                        .name(CACHE_NAME.concat(String.valueOf(i)))
                        .putReturnsNull(true)
                        .createOrRecoverPersistedTo(
                                new File(
                                        "/var/opt/cache/"
                                                .concat(CACHE_NAME)
                                                .concat(String.valueOf(i))
                                                .concat(".dat")));
    } catch (final IOException e) {
        LOGGER.error("GA cache init error", e);
    }
}
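The builder chain above should reduce to something like this minimal form (a sketch I haven't run in this stripped-down shape; it swaps createOrRecoverPersistedTo() for an in-memory create(), but the constantValueSizeBySample() call is identical):

double[][] sample = new double[][] {
    {Math.random(), Math.random()},
    {Math.random(), Math.random()},
    {Math.random(), Math.random()}
};
ChronicleMap<CharSequence, double[][]> map =
        ChronicleMap.of(CharSequence.class, double[][].class)
                .averageKeySize(44.0)
                .constantValueSizeBySample(sample) // a double[3][2], same shape as every real value
                .entries(1000)
                .create(); // with this configuration I get: value size should be 119

Since every value is a double[3][2], I expected the serialized size to be constant, which is why constantValueSizeBySample() seemed like the right choice over averageValueSize().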