I am using the Executors framework (a fixed thread pool backed by an unbounded blocking queue) to execute tasks concurrently.
But when I run a load test that creates about 10,000 tasks, there is a huge build-up of heap memory (2.1 GB), with about 3.5 million Executable objects.
I am not sure whether the unbounded queue is causing this build-up.
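For reference, as far as I can tell from the JDK source, Executors.newFixedThreadPool(threadCount) is essentially the construction below (java.util.concurrent imports assumed): a ThreadPoolExecutor whose work queue is an unbounded LinkedBlockingQueue, so it never rejects a task.

// Equivalent to Executors.newFixedThreadPool(threadCount):
// core and max pool size are the same, and the work queue is an
// unbounded LinkedBlockingQueue, so every submitted Runnable sits
// on the heap until one of the threadCount threads picks it up.
ExecutorService pool = new ThreadPoolExecutor(
        threadCount, threadCount,
        0L, TimeUnit.MILLISECONDS,
        new LinkedBlockingQueue<Runnable>());

So if tasks are submitted faster than the pool can drain them, the queue (and everything each queued Runnable references) just keeps growing.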
Memory Analyzer report:
One instance of "java.util.concurrent.ThreadPoolExecutor" loaded by "" occupies 2,299,506,584 (94.97%) bytes. The instance is referenced by com.test.ScheduleBean @ 0x743592b28, loaded by "org.jboss.modules.ModuleClassLoader @ 0x741b4cc40".
Any pointers appreciated!
// The Executors are stored in a HashMap
HashMap<String, Executor> poolExecutorMap = new HashMap<>();

// The Executor is a fixed thread pool
Executor poolExecutor = Executors.newFixedThreadPool(threadCount);

// Add the executor to the map
poolExecutorMap.put("Executor", poolExecutor);

// A list of tasks is pulled from the database; each task is wrapped in a
// Runnable built via reflection and submitted to the executor
Class<?> monitorClass = null;
List<Task> list = getAllTasksToProcess();
for (int i = 0; i < list.size(); i++) {
    Task task = list.get(i);
    monitorClass = Class.forName(task.getTask_event_name());
    Constructor<?> ctor = monitorClass.getConstructor(Task.class);
    Object object = ctor.newInstance(task);
    logger.debug("Adding task number : " + task.getTask_sequence_id());
    poolExecutorMap.get("Executor").execute((Runnable) object);
}

// The task classes have an execute method which sends an HTTP notification.
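For completeness, each of these task classes looks roughly like the sketch below (HttpNotificationTask is a placeholder name, not the real class); the concrete class is resolved at runtime via Class.forName(task.getTask_event_name()), and getConstructor(Task.class) matches its one-argument constructor.

// Illustrative sketch only; the real class name comes from the database.
public class HttpNotificationTask implements Runnable {

    private final Task task;

    // Matched by monitorClass.getConstructor(Task.class)
    public HttpNotificationTask(Task task) {
        this.task = task;
    }

    @Override
    public void run() {
        // send the HTTP notification for this task (details omitted)
    }
}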