I have implemented a Chronicle Map as:
ChronicleMapBuilder
    .of(LongValue.class, ExceptionQueueVOInterface.class)
    .name(MAP_NAME)
    .entries(2_000_000)
    .maxBloatFactor(3)
    .create();
The map holds around 3.5 million records. I am using it as a cache for search operations, and many searches have to iterate over the whole map to collect a list of matching entries. The 3.5 million records take about 8.7 GB of RAM, my Java heap is allocated 36 GB, and the server has 49 GB of RAM in total. When I iterate over the map with the code below:
map.forEachEntry(entry -> {
    if (exportSize.get() != MAX_EXPORT_SIZE
            && TeamQueueFilter.applyFilter(exceptionFilters, entry.value().get(), queType, uid)) {
        ExceptionQueue exceptionQueue = getExceptionQueue(entry.value().get());
        if (exceptionFilters.isRts23Selected()
                || exceptionQueue.getMessageType().equals(MessageType.RTS23.toString())) {
            exceptionQueue.setForsId(Constant.BLANK_STRING);
            rts23CaseIdMap.put(String.valueOf(entry.key().get().getValue()), exceptionQueue);
        } else {
            handleMessage(exceptionQueue, entry.key().get(), caseDetailsList);
        }
        exportSize.incrementAndGet();
    }
});
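One thing I have noticed: forEachEntry has no way to break out early, so even after exportSize reaches MAX_EXPORT_SIZE the lambda still visits every remaining entry and keeps allocating per-entry objects. Since Chronicle Map implements ConcurrentMap, one idea I am considering is iterating the entrySet with an explicit break instead. Here is a minimal sketch of that pattern; a plain java.util.Map stands in for the Chronicle Map, and matchesFilter is a hypothetical stand-in for TeamQueueFilter.applyFilter:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EarlyExitScan {
    // Cap from the question; 2 here just to keep the example small.
    static final int MAX_EXPORT_SIZE = 2;

    // Scans the map, exporting matches until the cap is hit, then stops.
    // Returns how many entries were actually visited, to show the early exit.
    static int scan(Map<Long, String> map) {
        int exported = 0;
        int visited = 0;
        for (Map.Entry<Long, String> e : map.entrySet()) {
            visited++;
            if (matchesFilter(e.getValue())) {
                exported++;          // export/handle the entry here
                if (exported == MAX_EXPORT_SIZE) {
                    break;           // stop scanning: no wasted iteration past the cap
                }
            }
        }
        return visited;
    }

    // Hypothetical filter standing in for TeamQueueFilter.applyFilter(...).
    static boolean matchesFilter(String value) {
        return value.startsWith("keep");
    }

    public static void main(String[] args) {
        Map<Long, String> map = new LinkedHashMap<>();
        map.put(1L, "keep-a");
        map.put(2L, "drop-b");
        map.put(3L, "keep-c");
        map.put(4L, "keep-d"); // never visited: the cap is reached at key 3
        System.out.println(scan(map)); // prints 3
    }
}
```

I realize this only avoids wasted work after the cap is reached; it does not help when a search genuinely has to touch all 3.5 million entries.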
it always gives me a memory error: VM.array size is less than available memory. Any tips on how to iterate over this large map? Thanks for the help.