
I need to process messages in parallel, but preserve the processing order of messages with the same conversation ID.

Example:
Let's define a Message like this:

class Message {
    Message(long id, long conversationId, String someData) {...}
}

Suppose the messages arrive in the following order:
Message(1, 1, "a1"), Message(2, 2, "a2"), Message(3, 1, "b1"), Message(4, 2, "b2").

I need message 3 to be processed after message 1, since messages 1 and 3 have the same conversation ID (similarly, message 4 should be processed after message 2 for the same reason).
I don't care about the relative order between e.g. 1 and 2, since they have different conversation IDs.

I would like to reuse the Java ThreadPoolExecutor's functionality as much as possible, to avoid having to replace dead threads manually in my code, etc.

Update: The number of possible 'conversation-ids' is not limited, and there is no time limit on a conversation. (I personally don't see it as a problem, since I can have a simple mapping from a conversationId to a worker number, e.g. conversationId % totalWorkers).

Update 2: There is one problem with a solution based on multiple queues, where the queue number is determined by e.g. 'index = Objects.hash(conversationId) % total': if it takes a long time to process some message, all messages with the same 'index' but a different 'conversationId' will wait, even though other threads are available to handle them. That is, I believe solutions with a single smart blocking queue would be better, but that's just an opinion; I am open to any good solution.

Do you see an elegant solution for this problem?

Alexander
  • Have you tried creating an Executors.newSingleThreadExecutor() for each unique conversationId? – Vitaliy Moskalyuk Jun 02 '17 at 11:21
  • Might certainly be an option - to have an array of single-thread ThreadPoolExecutor objects, and to submit a message to executor[conversationId % total] – Alexander Jun 02 '17 at 11:28
  • But of course that means that you keep piling up those executors. When do you know that you can discard a pool? Maybe that is an important thing to understand in the first place: do you have any notion of the "lifetime" of "conversation ids"? – GhostCat Jun 02 '17 at 11:30
  • Thanks for noting. Let me clarify the question. – Alexander Jun 02 '17 at 11:31
  • What if your pool picks up a message with conversationId 1 and then one with conversationId 2, but processing of 2 finishes earlier than 1? Is this behaviour allowed? – Sneh Jun 02 '17 at 12:10
  • There is absolutely no restriction on either execution or completion order of messages 1 and 2 since they have different conversation IDs – Alexander Jun 02 '17 at 12:12
  • Instead of one Executor for each conversation thread, you could have a small fixed number of them, and you could choose the Executor for any given conversation thread by hashing the conversationId. – Solomon Slow Jun 02 '17 at 13:08
  • This might be an option, but it has its drawbacks. It seems awkward. You have to drag the worker's queue into its executor service, since you don't want to lose messages if a new thread is created by the thread pool instead of a dead thread. Also, a pool of pools seems like an unnecessary complication... – Alexander Jun 02 '17 at 14:05
  • how do you receive the messages? Do you have a timeout on arrival? Eg when Message 3 arrives before Message 1, how long would you wait for Message 1 to arrive, do you have any guarantees that the message is not lost and you wait forever? – Thomas Jungblut Jun 09 '17 at 07:47
  • No guarantees. All the messages just arrive at random. The solution should not assume any business logic between the messages. It is possible, for example, that some conversation will only have a single message, while another conversation will have 10 messages. – Alexander Jun 09 '17 at 07:56
  • Hi @Alexander, you OK man? You were keen on giving feedback to the first answers but fell silent. So, which one should I copy-paste? Quick, I need it to efficiently send a ton of spam to many targets, in the right order per-target ;) – qlown Jun 21 '17 at 07:03

8 Answers


I had to do something very similar some time ago, so here is an adaptation.

(See it in action online)

It's actually the exact same base need, but in my case the key was a String, and more importantly the set of keys was not growing indefinitely, so here I had to add a "cleanup scheduler". Other than that it's basically the same code, so I hope I have not lost anything serious in the adaptation process. I tested it, looks like it works. It's longer than other solutions, though, perhaps more complex...

Base idea:

  • MessageTask wraps a message into a Runnable, and notifies queue when it is complete
  • ConvoQueue: blocking queue of messages, for a conversation. Acts as a prequeue that guarantees desired order. See this trio in particular: ConvoQueue.runNextIfPossible() → MessageTask.run() → ConvoQueue.complete() → …
  • MessageProcessor has a Map<Long, ConvoQueue>, and an ExecutorService
  • messages are processed by any thread in the executor, the ConvoQueues feed the ExecutorService and guarantee message order per convo, but not globally (so a "difficult" message will not block other conversations from being processed, unlike some other solutions, and that property was critically important in our case -- if it's not that critical for you, maybe a simpler solution is better)
  • cleanup with ScheduledExecutorService (takes 1 thread)

Visually:

   ConvoQueues              ExecutorService's internal queue
                            (shared, but has at most 1 MessageTask per convo)
Convo 1   ########   
Convo 2      #####   
Convo 3    #######                        Thread 1
Convo 4              } →    ####    → {
Convo 5        ###                        Thread 2
Convo 6  #########   
Convo 7      #####   

(Convo 4 is about to be deleted)

Below all the classes (MessageProcessorTest can be executed directly):

// MessageProcessor.java
import java.util.*;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;

import static java.util.concurrent.TimeUnit.SECONDS;

public class MessageProcessor {

    private static final long CLEANUP_PERIOD_S = 10;
    private final Map<Long, ConvoQueue> queuesByConvo = new HashMap<>();
    private final ExecutorService executorService;

    public MessageProcessor(int nbThreads) {
        executorService = Executors.newFixedThreadPool(nbThreads);
        ScheduledExecutorService cleanupScheduler = Executors.newScheduledThreadPool(1);
        cleanupScheduler.scheduleAtFixedRate(this::removeEmptyQueues, CLEANUP_PERIOD_S, CLEANUP_PERIOD_S, SECONDS);
    }

    public void addMessageToProcess(Message message) {
        ConvoQueue queue = getQueue(message.getConversationId());
        queue.addMessage(message);
    }

    private ConvoQueue getQueue(Long convoId) {
        synchronized (queuesByConvo) {
            return queuesByConvo.computeIfAbsent(convoId, p -> new ConvoQueue(executorService));
        }
    }

    private void removeEmptyQueues() {
        synchronized (queuesByConvo) {
            queuesByConvo.entrySet().removeIf(entry -> entry.getValue().isEmpty());
        }
    }

}


// ConvoQueue.java
import java.util.Queue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.LinkedBlockingQueue;

class ConvoQueue {

    private Queue<MessageTask> queue;        // tasks of this conversation waiting to run
    private MessageTask activeTask;          // the single task of this conversation currently submitted to the executor, or null
    private ExecutorService executorService; // executor shared by all conversations

    ConvoQueue(ExecutorService executorService) {
        this.executorService = executorService;
        this.queue = new LinkedBlockingQueue<>();
    }

    private void runNextIfPossible() {
        synchronized(this) {
            if (activeTask == null) {
                activeTask = queue.poll();
                if (activeTask != null) {
                    executorService.submit(activeTask);
                }
            }
        }
    }

    void complete(MessageTask task) {
        synchronized(this) {
            if (task == activeTask) {
                activeTask = null;
                runNextIfPossible();
            }
            else {
                throw new IllegalStateException("Attempt to complete task that is not supposed to be active: "+task);
            }
        }
    }

    boolean isEmpty() {
        return queue.isEmpty();
    }

    void addMessage(Message message) {
        add(new MessageTask(this, message));
    }

    private void add(MessageTask task) {
        synchronized(this) {
            queue.add(task);
            runNextIfPossible();
        }
    }

}

// MessageTask.java
public class MessageTask implements Runnable {

    private ConvoQueue convoQueue;
    private Message message;

    MessageTask(ConvoQueue convoQueue, Message message) {
        this.convoQueue = convoQueue;
        this.message = message;
    }

    @Override
    public void run() {
        try {
            processMessage();
        }
        finally {
            convoQueue.complete(this);
        }
    }

    private void processMessage() {
        // Dummy processing with random delay to observe reordered messages & preserved convo order
        try {
            Thread.sleep((long) (50*Math.random()));
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        System.out.println(message);
    }

}

// Message.java
class Message {

    private long id;
    private long conversationId;
    private String data;

    Message(long id, long conversationId, String someData) {
        this.id = id;
        this.conversationId = conversationId;
        this.data = someData;
    }

    long getConversationId() {
        return conversationId;
    }

    String getData() {
        return data;
    }

    public String toString() {
        return "Message{" + id + "," + conversationId + "," + data + "}";
    }
}

// MessageProcessorTest.java
public class MessageProcessorTest {
    public static void main(String[] args) {
        MessageProcessor test = new MessageProcessor(2);
        for (int i=1; i<100; i++) {
            test.addMessageToProcess(new Message(1000+i,i%7,"hi "+i));
        }
    }
}

Output (for each convo ID (2nd field) order is preserved):

Message{1002,2,hi 2}
Message{1001,1,hi 1}
Message{1004,4,hi 4}
Message{1003,3,hi 3}
Message{1005,5,hi 5}
Message{1006,6,hi 6}
Message{1009,2,hi 9}
Message{1007,0,hi 7}
Message{1008,1,hi 8}
Message{1011,4,hi 11}
Message{1010,3,hi 10}
...
Message{1097,6,hi 97}
Message{1095,4,hi 95}
Message{1098,0,hi 98}
Message{1099,1,hi 99}
Message{1096,5,hi 96}

The test above gave me enough confidence to share it, but I'm slightly worried that I might have forgotten details for pathological cases. It has been running in production for years without hitches (although with more code that allows us to inspect it live when we need to see what's happening, why a certain queue takes time, etc. -- never a problem with the system above in itself, but sometimes with the processing of a particular task).

Edit: click here to test online. Alternative: copy that gist in there, and press "Compile & Execute".

Hugues M.

I'm not sure how you want messages to be processed. For convenience, each message here is a Runnable, so the message itself carries the code to execute.

The idea is to have a number of Executors which are submitted to a shared ExecutorService. Use the modulo operation to calculate which Executor an incoming message should be distributed to. Obviously, for the same conversation ID it is the same Executor, hence you have parallel processing overall but sequential processing for the same conversation ID. It is not guaranteed that messages with different conversation IDs will always execute in parallel (after all, you are bounded, at least, by the number of physical cores in your system).

import java.util.Arrays;
import java.util.Objects;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.stream.IntStream;

public class MessageExecutor {

    public interface Message extends Runnable {

        long getId();

        long getConversationId();

        String getMessage();

    }

    private static class Executor implements Runnable {

        private final LinkedBlockingQueue<Message> messages = new LinkedBlockingQueue<>();

        private volatile boolean stopped;

        void schedule(Message message) {
            messages.add(message);
        }

        void stop() {
            stopped = true;
        }

        @Override
        public void run() {
            while (!stopped) {
                try {
                    Message message = messages.take();
                    message.run();
                } catch (Exception e) {
                    System.err.println(e.getMessage());
                }
            }
        }
    }

    private final Executor[] executors;
    private final ExecutorService executorService;

    public MessageExecutor(int poolCount) {
        executorService = Executors.newFixedThreadPool(poolCount);
        executors = new Executor[poolCount];

        IntStream.range(0, poolCount).forEach(i -> {
            Executor executor = new Executor();
            executorService.submit(executor);
            executors[i] = executor;
        });
    }

    public void submit(Message message) {
        // Math.floorMod avoids a negative index when the hash of the conversation ID is negative
        final int executorNr = Math.floorMod(Objects.hash(message.getConversationId()), executors.length);
        // log which worker the message is routed to (matches the sample output below)
        System.out.println("Message with conversation id [" + message.getConversationId()
                + "] is scheduled on scheduler #[" + executorNr + "]");
        executors[executorNr].schedule(message);
    }

    public void stop() {
        Arrays.stream(executors).forEach(Executor::stop);
        executorService.shutdown();
    }
}

You can then start the message executor with a given pool size and submit messages to it.

public static void main(String[] args) {
    MessageExecutor messageExecutor = new MessageExecutor(Runtime.getRuntime().availableProcessors());
    messageExecutor.submit(new Message() {
        @Override
        public long getId() {
            return 1;
        }

        @Override
        public long getConversationId() {
            return 1;
        }

        @Override
        public String getMessage() {
            return "abc1";
        }

        @Override
        public void run() {
            System.out.println(this.getMessage());
        }
    });
    messageExecutor.submit(new Message() {
        @Override
        public long getId() {
            return 1;
        }

        @Override
        public long getConversationId() {
            return 2;
        }

        @Override
        public String getMessage() {
            return "abc2";
        }

        @Override
        public void run() {
            System.out.println(this.getMessage());
        }
    });
    messageExecutor.stop();
}

When I run with a pool count of 2 and submit a number of messages:

Message with conversation id [1] is scheduled on scheduler #[0]
Message with conversation id [2] is scheduled on scheduler #[1]
Message with conversation id [3] is scheduled on scheduler #[0]
Message with conversation id [4] is scheduled on scheduler #[1]
Message with conversation id [22] is scheduled on scheduler #[1]
Message with conversation id [22] is scheduled on scheduler #[1]
Message with conversation id [22] is scheduled on scheduler #[1]
Message with conversation id [22] is scheduled on scheduler #[1]
Message with conversation id [1] is scheduled on scheduler #[0]
Message with conversation id [2] is scheduled on scheduler #[1]
Message with conversation id [3] is scheduled on scheduler #[0]
Message with conversation id [3] is scheduled on scheduler #[0]
Message with conversation id [4] is scheduled on scheduler #[1]

When the same amount of messages runs with a pool count of 3:

Message with conversation id [1] is scheduled on scheduler #[2]
Message with conversation id [2] is scheduled on scheduler #[0]
Message with conversation id [3] is scheduled on scheduler #[1]
Message with conversation id [4] is scheduled on scheduler #[2]
Message with conversation id [22] is scheduled on scheduler #[2]
Message with conversation id [22] is scheduled on scheduler #[2]
Message with conversation id [22] is scheduled on scheduler #[2]
Message with conversation id [22] is scheduled on scheduler #[2]
Message with conversation id [1] is scheduled on scheduler #[2]
Message with conversation id [2] is scheduled on scheduler #[0]
Message with conversation id [3] is scheduled on scheduler #[1]
Message with conversation id [3] is scheduled on scheduler #[1]
Message with conversation id [4] is scheduled on scheduler #[2]

Messages get distributed nicely among the pool of Executors :).

EDIT: the Executor's run() catches all exceptions, to ensure it does not break when one message fails.
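
As the comments below point out, catching Exception alone still lets an Error kill the worker silently. A possible variant of the loop (my sketch, not part of the original answer) catches Throwable and handles interruption explicitly:

        @Override
        public void run() {
            while (!stopped) {
                try {
                    Message message = messages.take();
                    message.run();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // preserve the interrupt status and stop this worker
                    return;
                } catch (Throwable t) {
                    // Keeps the worker alive even if a task throws an Error,
                    // at the cost of potentially masking serious problems.
                    System.err.println(t.getMessage());
                }
            }
        }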

Velth
  • Good possible solution. Yet it still has problems. If 'message.run()' throws an exception then the Executor's thread will die, and all the messages in its queue will be lost. – Alexander Jun 02 '17 at 12:16
  • To be sure, you can always just catch all exceptions. Or make sure the run() of your message doesn't break the Executor by throwing unexpected exceptions. – Velth Jun 02 '17 at 12:18
  • Not sure it's a good solution to just catch everything in the runnable. Certainly you don't want to catch throwables etc. If I can catch everything I don't need the ThreadPoolExecutor at all - I could keep Thread[] instead of Executor[] array – Alexander Jun 02 '17 at 12:35
  • Worse, if your 'Executor.run()' gets a Throwable, it will just silently stop. The ExecutorService will not restart it. – Alexander Jun 05 '17 at 06:50
  • @Alexander I think you misunderstand the guarantees of the `ExecutorService`. There is no guarantee that the `Callable` is rescheduled if it dies from an exception. It however does manage threads and multiplexing the `Callable` onto them, with a possibility of rescheduling threads if they somehow die. If you need a continuously running computation `Callable`/`Runnable` you have to catch all exceptions, preferably `Throwable`. – Thomas Jungblut Jun 09 '17 at 07:39
  • @Alexander also generally providing linearization guarantees on a distributed system is hard. Velth's solution provides a good trade-off between scalability and simplicity of the solution. – Thomas Jungblut Jun 09 '17 at 07:43
  • I agree that the ExecutorService will not restart the Runnable, but it will keep the thread. I am just saying that looping inside the Runnable and having a queue tied to this Runnable requires you to catch all Throwables, which is not considered a good option. – Alexander Jun 09 '17 at 08:01

You essentially want the work to be done sequentially within a conversation. One solution would be to synchronize on a mutex that is unique to that conversation. The drawback of that solution is that if conversations are short-lived and new conversations start frequently, the "mutexes" map will grow fast.

For brevity's sake I've omitted the executor shutdown, actual message processing, exception handling etc.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class MessageProcessor {

  private final ExecutorService executor;
  private final ConcurrentMap<Long, Object> mutexes = new ConcurrentHashMap<> ();

  public MessageProcessor(int threadCount) {
    executor = Executors.newFixedThreadPool(threadCount);
  }

  public static void main(String[] args) throws InterruptedException {
    MessageProcessor p = new MessageProcessor(10);
    BlockingQueue<Message> queue = new ArrayBlockingQueue<> (1000);

    //some other thread populates the queue

    while (true) {
      Message m = queue.take();
      p.process(m);
    }
  }

  public void process(Message m) {
    Object mutex = mutexes.computeIfAbsent(m.getConversationId(), id -> new Object());
    executor.submit(() -> {
      synchronized(mutex) {
        //That's where you actually process the message
      }
    });
  }
}
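
To address the indefinitely growing 'mutexes' map discussed in the comments below, one option (my addition, not part of this answer) is to intern the lock objects weakly, for example with Guava's Interners, so that entries can be garbage-collected once no queued or running task uses them any more. A sketch of that variant:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import com.google.common.collect.Interner;
import com.google.common.collect.Interners;

public class MessageProcessor {

  private final ExecutorService executor;
  // Weakly interned per-conversation locks: an entry disappears once no queued or running task references it.
  private final Interner<Long> mutexes = Interners.newWeakInterner();

  public MessageProcessor(int threadCount) {
    executor = Executors.newFixedThreadPool(threadCount);
  }

  public void process(Message m) {
    // The canonical Long instance returned by intern() acts as the mutex for this conversation.
    Long mutex = mutexes.intern(m.getConversationId());
    executor.submit(() -> {
      synchronized (mutex) {
        //That's where you actually process the message
      }
    });
  }
}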
assylias
  • I like your solution, it is simple and elegant, but as it stands it has no chance of going to production even with exception handling etc. added. My main issue is that the 'mutexes' map grows indefinitely; you naturally can't just remove the entry at the end of 'process()'. I guess it can be handled somehow, maybe with an AtomicInteger instead of Object, but that is a first complication. Second, a heavy conversation can leave all my threads waiting even though I have CPUs available to process other conversations. – Alexander Jun 09 '17 at 20:09
  • It is a simple and elegant solution. The other concern I have is that it may tie up other processing threads while the message in the same conversation is still processing. This may be fine but can hurt throughput. – John Vint Jun 12 '17 at 18:39
  • You could maybe solve the "indefinite growing" issue with [this](https://stackoverflow.com/a/27809294/829571). I haven't used it before so can't tell for sure. – assylias Jun 13 '17 at 07:59

I had a similar problem in my application. My first solution was to sort the messages using a java.util.concurrent.ConcurrentHashMap. In your case, this would be a ConcurrentHashMap with conversationId as key and a list of messages as value. The problem was that the map got too big, taking up too much space.

My current solution is the following: one thread receives the messages and stores them in a java.util.ArrayList. After receiving N messages it pushes the list to a second thread. This thread sorts the messages with ArrayList.sort, by conversationId and then id. Then it iterates through the sorted list and searches for blocks which can be processed. Each block which can be processed is taken out of the list. To process a block you can create a Runnable with this block and push it to an executor service. The messages which could not be processed remain in the list and will be checked in the next round.
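
A minimal sketch of that batch-sort-and-dispatch idea (not the author's actual code), assuming a Message class with getId()/getConversationId() getters; the receiving thread that accumulates N messages is omitted, and the names (MessageBatchDispatcher, inFlight, dispatch) are illustrative:

import java.util.*;
import java.util.concurrent.*;

class MessageBatchDispatcher {

    private final ExecutorService workers = Executors.newFixedThreadPool(4);
    // Conversations currently being processed; a new block for such a conversation must wait.
    private final Set<Long> inFlight = ConcurrentHashMap.newKeySet();
    // Messages that could not be dispatched yet; re-checked in the next round.
    private final List<Message> pending = new ArrayList<>();

    // Called by the single sorting thread with each batch of N received messages.
    void dispatch(List<Message> batch) {
        pending.addAll(batch);
        pending.sort(Comparator.comparingLong(Message::getConversationId)
                               .thenComparingLong(Message::getId));

        // Group the sorted messages into one block per conversation.
        Map<Long, List<Message>> blocks = new LinkedHashMap<>();
        for (Message m : pending) {
            blocks.computeIfAbsent(m.getConversationId(), k -> new ArrayList<>()).add(m);
        }

        for (Map.Entry<Long, List<Message>> e : blocks.entrySet()) {
            long convoId = e.getKey();
            List<Message> block = e.getValue();
            if (inFlight.add(convoId)) {          // nothing running for this conversation yet
                pending.removeAll(block);
                workers.submit(() -> {
                    try {
                        block.forEach(this::process);
                    } finally {
                        inFlight.remove(convoId); // allow the next block of this conversation
                    }
                });
            }
        }
    }

    private void process(Message message) {
        System.out.println(message);
    }
}

Blocks whose conversation is still in flight simply stay in the pending list and are retried on the next batch, which corresponds to the "checked in the next round" behaviour described above.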

Thomas Krieger
  • It might wait for too long until N messages are received, i.e. even N=2 might be too big. Ideally, each message is processed immediately if another message with the same conversation id is not being processed. – Alexander Jun 13 '17 at 08:59
  • Set it to one :-). Seriously, it depends on what you want to achieve. If you have only low traffic and need to reduce latency, set N to one and process immediately. If, on the other hand, you need to improve throughput, as in my case, use a higher N, something like 100. – Thomas Krieger Jun 13 '17 at 09:35

For what it's worth, the Kafka Streams API provides most of this functionality. Partitions preserve ordering. It's a larger buy-in than an ExecutorService but could be interesting, especially if you happen to use Kafka already.
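
A rough sketch of what that could look like, assuming a "messages" topic whose records are keyed by conversation ID (the topic name, serdes and process() method are placeholders, not something prescribed by Kafka): records with the same key go to the same partition, and each partition is processed by a single stream task, which preserves per-conversation order.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;

public class ConversationStreamProcessor {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "conversation-processor");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        // Records are keyed by conversation ID, so all messages of one conversation
        // land in the same partition and are processed in order.
        builder.stream("messages", Consumed.with(Serdes.Long(), Serdes.String()))
               .foreach((conversationId, data) -> process(conversationId, data));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }

    private static void process(long conversationId, String data) {
        System.out.println(conversationId + " -> " + data);
    }
}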

Steven Schlansker
  • This is a massive buy-in if you don't already use Kafka, but interesting (upvoted). I think OP is after a self-contained, single-JVM solution. If a library implemented this, that would be ideal (for me as well, as I have a similar thingy in production, see my answer). It does not look like Kafka Streams layer is generic enough to be used outside Kafka, right? (I mean it consumes Kafka streams and produces other Kafka streams... ?) – Hugues M. Jun 15 '17 at 10:07

I would use three executorServices (one for receiving messages, one for sorting messages, one for processing messages).

I would also use one queue for all received messages and another queue for messages that have been sorted and grouped (sorted by conversationId, then grouped into blocks of messages that share the same conversationId).

Finally: one thread for receiving messages, one thread for sorting messages and all remaining threads used for processing messages.

see below:

import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.NoSuchElementException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingDeque;
import java.util.stream.Collectors;



public class MultipleMessagesExample {


    private static int MAX_ELEMENTS_MESSAGE_QUEUE = 1000;
    private BlockingQueue<Message> receivingBlockingQueue = new LinkedBlockingDeque<>(MAX_ELEMENTS_MESSAGE_QUEUE);
    private BlockingQueue<List<Message>> prioritySortedBlockingQueue = new LinkedBlockingDeque<>(MAX_ELEMENTS_MESSAGE_QUEUE);

    public static void main(String[] args) {


        MultipleMessagesExample multipleMessagesExample = new MultipleMessagesExample();
        multipleMessagesExample.doTheWork();

    }

    private void doTheWork() {
        int totalCores = Runtime.getRuntime().availableProcessors();
        int totalSortingProcesses = 1;
        int totalMessagesReceiverProcess = 1;
        int totalMessagesProcessors = totalCores - totalSortingProcesses - totalMessagesReceiverProcess;

        ExecutorService messagesReceiverExecutorService = Executors.newFixedThreadPool(totalMessagesReceiverProcess);
        ExecutorService sortingExecutorService = Executors.newFixedThreadPool(totalSortingProcesses);
        ExecutorService messageProcessorExecutorService = Executors.newFixedThreadPool(totalMessagesProcessors);

        MessageReceiver messageReceiver = new MessageReceiver(receivingBlockingQueue);
        messagesReceiverExecutorService.submit(messageReceiver);

        MessageSorter messageSorter = new MessageSorter(receivingBlockingQueue, prioritySortedBlockingQueue);
        sortingExecutorService.submit(messageSorter);


        for (int i = 0; i < totalMessagesProcessors; i++) {
            MessageProcessor messageProcessor = new MessageProcessor(prioritySortedBlockingQueue);
            messageProcessorExecutorService.submit(messageProcessor);
        }

    }
}


class Message {
    private Long id;
    private Long conversationId;
    private String someData;

    public Message(Long id, Long conversationId, String someData) {
        this.id = id;
        this.conversationId = conversationId;
        this.someData = someData;
    }


    public Long getId() {
        return id;
    }

    public Long getConversationId() {
        return conversationId;
    }

    public String getSomeData() {
        return someData;
    }
}

class MessageReceiver implements Callable<Void> {
    private BlockingQueue<Message> blockingQueue;


    public MessageReceiver(BlockingQueue<Message> blockingQueue) {
        this.blockingQueue = blockingQueue;
    }

    @Override
    public Void call() throws Exception {
        System.out.println("receiving messages...");

        blockingQueue.add(new Message(1L, 1000L, "conversation1 data fragment 1"));
        blockingQueue.add(new Message(2L, 2000L, "conversation2 data fragment 1"));
        blockingQueue.add(new Message(3L, 1000L, "conversation1 data fragment 2"));
        blockingQueue.add(new Message(4L, 2000L, "conversation2 data fragment 2"));

        return null;
    }
}

/**
 * sorts messages. group together same conversation IDs
 */
class MessageSorter implements Callable<Void> {

    private BlockingQueue<Message> receivingBlockingQueue;
    private BlockingQueue<List<Message>> prioritySortedBlockingQueue;
    private List<Message> intermediateList = new ArrayList<>();
    private MessageComparator messageComparator = new MessageComparator();


    private static int BATCH_SIZE = 10;

    public MessageSorter(BlockingQueue<Message> receivingBlockingQueue, BlockingQueue<List<Message>> prioritySortedBlockingQueue) {
        this.receivingBlockingQueue = receivingBlockingQueue;
        this.prioritySortedBlockingQueue = prioritySortedBlockingQueue;

    }

    @Override
    public Void call() throws Exception {

        while (true) {
            boolean messagesReceivedQueueIsEmpty = false;
            intermediateList = new ArrayList<>();
            for (int i = 0; i < BATCH_SIZE; i++) {
                try {
                    Message message = receivingBlockingQueue.remove();
                    intermediateList.add(message);
                } catch (NoSuchElementException e) {
                    // this is expected when queue is empty
                    messagesReceivedQueueIsEmpty = true;
                    break;
                }

            }
            Collections.sort(intermediateList, messageComparator);

            if (intermediateList.size() > 0) {
                Map<Long, List<Message>> map = intermediateList.stream().collect(Collectors.groupingBy(message -> message.getConversationId()));
                map.forEach((k, v) -> prioritySortedBlockingQueue.add(new ArrayList<>(v)));
                System.out.println("new batch of messages was sorted and is ready to be processed");
            }

            if (messagesReceivedQueueIsEmpty) {
                System.out.println("message processor is waiting for messages...");
                Thread.sleep(1000);  // no need to use CPU if there are no messages to process
            }
        }
    }

}


/**
 * process groups of messages with same conversationID
 */
class MessageProcessor implements Callable<Void> {

    private BlockingQueue<List<Message>> prioritySortedBlockingQueue;

    public MessageProcessor(BlockingQueue<List<Message>> prioritySortedBlockingQueue) {
        this.prioritySortedBlockingQueue = prioritySortedBlockingQueue;
    }

    @Override
    public Void call() throws Exception {
        while (true) {
            List<Message> messages = prioritySortedBlockingQueue.take();  // blocks if no message is available
            messages.stream().forEach(m -> processMessage(m));
        }
    }

    private void processMessage(Message message) {
        System.out.println(message.getId() + " - " + message.getConversationId() + " - " + message.getSomeData());
    }
}


class MessageComparator implements Comparator<Message> {

    @Override
    public int compare(Message o1, Message o2) {
        // Long.compare avoids the overflow that casting the difference to int could cause
        return Long.compare(o1.getConversationId(), o2.getConversationId());
    }
}
Jose Zevallos

Create an executor class implementing Executor. On submit you can put code like the one below.

public void execute(Runnable command) {

        // 'command' is assumed to be a keyed task (see the sketch below) exposing e.g. the conversation ID;
        // 'workers' is an array of per-key workers of length 'size' (fields of the enclosing executor)
        final int key = command.getKey();
        //Some code to check if it is running
        final int index = key != Integer.MIN_VALUE ? Math.abs(key) % size : 0;
        workers[index].execute(command);
    }
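
For the getKey() call above to compile, the submitted tasks would need to implement some keyed-task interface; a hypothetical example (the name and shape are an assumption, not from this answer):

// Hypothetical task type assumed by the execute() sketch above:
// the key (for example derived from the conversation ID) selects the worker.
interface KeyedRunnable extends Runnable {
    int getKey();
}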

Create each worker with its own queue, so that tasks that must run sequentially all go through the same worker:

private final AtomicBoolean scheduled = new AtomicBoolean(false);

// 'maximumQueueSize' bounds this worker's backlog; 'executor' (used in schedule() below) is the shared delegate pool
private final BlockingQueue<Runnable> workQueue = new LinkedBlockingQueue<Runnable>(maximumQueueSize);

public void execute(Runnable command) {
    long timeout = 0;
    TimeUnit timeUnit = TimeUnit.SECONDS;
    if (command instanceof TimeoutRunnable) {
        TimeoutRunnable timeoutRunnable = ((TimeoutRunnable) command);
        timeout = timeoutRunnable.getTimeout();
        timeUnit = timeoutRunnable.getTimeUnit();
    }

    boolean offered;
    try {
        if (timeout == 0) {
            offered = workQueue.offer(command);
        } else {
            offered = workQueue.offer(command, timeout, timeUnit);
        }
    } catch (InterruptedException e) {
        throw new RejectedExecutionException("Thread is interrupted while offering work");
    }

    if (!offered) {
        throw new RejectedExecutionException("Worker queue is full!");
    }

    schedule();
}

private void schedule() {
    //if it is already scheduled, we don't need to schedule it again.
    if (scheduled.get()) {
        return;
    }

    if (!workQueue.isEmpty() && scheduled.compareAndSet(false, true)) {
        try {
            executor.execute(this);
        } catch (RejectedExecutionException e) {
            scheduled.set(false);
            throw e;
        }
    }
}

public void run() {
    try {
        Runnable r;
        do {
            r = workQueue.poll();
            if (r != null) {
                r.run();
            }
        }
        while (r != null);
    } finally {
        scheduled.set(false);
        schedule();
    }
}
gati sahu

This library should help: https://github.com/jano7/executor

ExecutorService underlyingExecutor = Executors.newCachedThreadPool();
KeySequentialRunner<Long> runner = new KeySequentialRunner<>(underlyingExecutor);

Message message = retrieveMessage();

Runnable task = new Runnable() {
    @Override
    public void run() {
        // process the message
    }
};

runner.run(message.conversationId, task);

Estok