37

I'm using java.util.concurrent's Executors class to create a fixed thread pool for running request handlers for a web server:

static ExecutorService newFixedThreadPool(int nThreads)

and the description is:

Creates a thread pool that reuses a fixed number of threads operating off a shared unbounded queue.

However, I am looking for a thread pool implementation that will do the exact same thing, except with a bounded queue. Is there such an implementation, or do I need to implement my own wrapper around the fixed thread pool?

Amir Rachum

4 Answers

47

What you want to do is create your own ExecutorService, probably using ThreadPoolExecutor. ThreadPoolExecutor has a constructor that takes a BlockingQueue; to get a bounded queue, use for example an ArrayBlockingQueue constructed with a fixed capacity. You can also supply a RejectedExecutionHandler to determine what happens when the queue is full, or keep a reference to the blocking queue and use its offer methods.

Here's a mini example:

// bounded at 100 queued tasks
BlockingQueue<Runnable> linkedBlockingDeque = new LinkedBlockingDeque<Runnable>(
    100);
ExecutorService executorService = new ThreadPoolExecutor(
    1, 10,                // core and maximum pool size
    30, TimeUnit.SECONDS, // keep-alive for idle non-core threads
    linkedBlockingDeque,
    new ThreadPoolExecutor.CallerRunsPolicy()); // run on the caller when full
Duncan Jones
lscoughlin
  • Do you know why there isn't a "PostingBlocksPolicy" which blocks until the task _can_ be posted? I'd like to ensure the job is (eventually) done, so none of the discard or abort policies work, and CallerRuns doesn't fly because my whole objective in using a (single-threaded) thread pool is to ensure that work is done on a particular, single thread. – bacar Nov 19 '14 at 16:24
  • There isn't really, and it would be what one might call "awkward" if you were to implement one, because the caller would never have control of the waits. What you can do, however, is call linkedBlockingDeque.offer(x) directly and take some action on the result: while (!linkedBlockingDeque.offer(myRunnable)) { /* wait, take some action, or throw an exception */ } – lscoughlin Nov 20 '14 at 13:12
  • If you have two tasks which are blocking/hung and two threads in the pool, and you set the queue size to 5 and add one more task, that task will take a very long time to execute – or, if both threads are hung, maybe never run. – Alex Punnen Aug 27 '16 at 06:53
  • @AlexPunen Yes. And? There are ways to manage that sort of scenario, but it's outside the scope of the question asked. If you yourself are having that problem, post the issue yourself; I'm sure SO will provide :P – lscoughlin Aug 29 '16 at 15:04
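The offer-based idea from the comments can be sketched as a complete program. One caveat worth noting: tasks offered straight to the queue never pass through execute(), so ThreadPoolExecutor gets no chance to spin up its workers and the core threads must be prestarted. The pool sizes, timeouts, and retry loop here are illustrative, not prescriptive:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BlockingSubmitExample {
    public static void main(String[] args) throws InterruptedException {
        // Bounded queue: at most 2 tasks may wait at once.
        BlockingQueue<Runnable> queue = new ArrayBlockingQueue<>(2);
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS, queue);
        // Tasks offered straight to the queue never go through execute(),
        // so the worker thread must be started up front.
        pool.prestartAllCoreThreads();

        for (int i = 0; i < 10; i++) {
            Runnable task = () -> {
                try {
                    Thread.sleep(50); // simulate some work
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            };
            // Block the submitter until the queue accepts the task,
            // rather than rejecting it or running it on the caller's thread.
            while (!queue.offer(task, 100, TimeUnit.MILLISECONDS)) {
                // queue still full; retry (or give up / log here)
            }
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
        System.out.println("completed: " + pool.getCompletedTaskCount());
    }
}
```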
7

Create a ThreadPoolExecutor and pass a suitable BlockingQueue implementation to it. For example, you can pass an ArrayBlockingQueue to the ThreadPoolExecutor constructor to get the desired effect.
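A minimal sketch of that, assuming a fixed pool of four threads and a queue capacity of 50 (both arbitrary):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPoolExample {
    public static void main(String[] args) throws InterruptedException {
        // core == max gives a fixed-size pool, like newFixedThreadPool,
        // but with a queue bounded at 50 tasks. When the queue is full,
        // execute() throws RejectedExecutionException (the default AbortPolicy).
        ExecutorService pool = new ThreadPoolExecutor(
                4, 4, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(50));
        pool.execute(() -> System.out.println("hello from the bounded pool"));
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```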

Suraj Chandran
6

I've solved this with a Semaphore that I use to throttle tasks being submitted to the ExecutorService.

Eg:

int threadCount = 10;
ExecutorService consumerPool = Executors.newFixedThreadPool(threadCount);

// set the permit count greater than the thread count so that we
// build up a limited buffer of waiting consumers
Semaphore semaphore = new Semaphore(threadCount * 100);

for (int i = 0; i < 1000000; ++i) {
    semaphore.acquire(); // this might block waiting for a permit
    final int task = i;  // the lambda needs an effectively final copy of i
    Runnable consumer = () -> {
       try {
          doSomeWork(task);
       } finally {
          semaphore.release(); // release a permit
       }
    };
    consumerPool.submit(consumer);
}
lance-java
  • This solution may lead to a permit leak. For example, if the future is cancelled before it is effectively executed, the semaphore will never be released and one permit is lost forever. The same problem occurs if the task is rejected by the ExecutorService. – Ildar Faizov Apr 29 '21 at 14:58
  • What if the release was placed outside the consumer? say you do: `CompletableFuture.supplyAsync(() -> doSomeWork(i), consumerPool).whenCompleteAsync((i, err)->semaphore.release(), consumerPool)`? – JSelser Feb 22 '23 at 22:31
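Building on the comments above, one way to reduce the risk of leaking permits is to release via the future's completion callback and to guard the submission itself. This is only a sketch of the idea; doSomeWork and the counts here are stand-ins:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class SemaphoreThrottleExample {
    // Hypothetical workload; stands in for doSomeWork in the answer above.
    static int doSomeWork(int i) {
        return i * 2;
    }

    public static void main(String[] args) throws InterruptedException {
        int threadCount = 4;
        ExecutorService consumerPool = Executors.newFixedThreadPool(threadCount);
        Semaphore semaphore = new Semaphore(threadCount * 2);

        for (int i = 0; i < 100; i++) {
            final int task = i; // the lambda needs an effectively final copy
            semaphore.acquire();
            try {
                CompletableFuture
                        .supplyAsync(() -> doSomeWork(task), consumerPool)
                        // release on the completion path, whether the task
                        // finished normally or exceptionally
                        .whenComplete((result, err) -> semaphore.release());
            } catch (RuntimeException e) {
                // submission itself failed (e.g. rejected): don't leak a permit
                semaphore.release();
                throw e;
            }
        }
        consumerPool.shutdown();
        consumerPool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println("done, permits: " + semaphore.availablePermits());
    }
}
```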
4

When you create a ThreadPoolExecutor you can give it a bounded BlockingQueue and a RejectedExecutionHandler so you can control what happens when the limit is reached. The default behaviour is to throw a RejectedExecutionException.

You can also define your own thread factory to control the thread names and make them daemon threads.
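A sketch of such a thread factory (the thread-name prefix and pool sizes here are illustrative):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class NamedDaemonFactoryExample {
    public static void main(String[] args) throws InterruptedException {
        ThreadFactory factory = new ThreadFactory() {
            private final AtomicInteger count = new AtomicInteger();
            @Override
            public Thread newThread(Runnable r) {
                Thread t = new Thread(r, "handler-" + count.incrementAndGet());
                t.setDaemon(true); // daemon threads won't keep the JVM alive
                return t;
            }
        };
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 2, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(10), // bounded queue, as above
                factory);
        pool.execute(() ->
                System.out.println("running on " + Thread.currentThread().getName()));
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```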

Peter Lawrey