
CLARIFICATION: Notice that this question is different from this one: How to implement a microservice Event Driven architecture with Spring Cloud Stream Kafka and Database per service

This one is about using Kafka as the only repository (of events), with no database needed; the other one is about using a database (MariaDB) per service plus Kafka.

I would like to implement an Event Sourcing architecture to handle distributed transactions:

OrdersService <------------> | Kafka Event Store | <------------>PaymentsService
                subscribe/                           subscribe/
                   find                                 find

OrdersService receives an order request and publishes the new Order to the broker.

private OrderBusiness orderBusiness;

// Spring Cloud Stream source binding for the orders topic
private OrderSource orderSource;

@PostMapping
public Order createOrder(@RequestBody Order order) {
    logger.debug("createOrder()");
    // do whatever
    // Publish the new Order with state = pending
    order.setState(PENDING);
    try {
        orderSource.output().send(MessageBuilder.withPayload(order).build());
    } catch (Exception e) {
        logger.error("Failed to publish order", e);
    }
    return order;
}

This is my main doubt: how can I query the Kafka broker? Imagine I want to search for orders by user, date, state, etc.

codependent
  • Possible duplicate of [How to implement a microservice Event Driven architecture with Spring Cloud Stream Kafka and Database per service](http://stackoverflow.com/questions/42140285/how-to-implement-a-microservice-event-driven-architecture-with-spring-cloud-stre) – spencergibb Feb 09 '17 at 16:55
  • Hi Spencer, I don't think it's a duplicate. The other one is about using Database (MariaDB) per service + Kafka. This one is about using Kafka as the only repository (of events), no DB needed: http://microservices.io/patterns/data/database-per-service.html vs http://microservices.io/patterns/data/event-sourcing.html – codependent Feb 09 '17 at 16:58

1 Answer


Short answer: you cannot query the broker directly, but you can exploit Kafka's Streams API and "Interactive Queries".

Long answer: The access pattern for reading Kafka topics is a linear scan, not a random lookup. Of course, you can reposition at any time via #seek(), but only by offset or timestamp. Also, topics are sharded into partitions, and data is (by default) hash-partitioned by key (the data model is key-value pairs), so there is a notion of a key.
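To make the access model concrete, here is a minimal, hypothetical sketch (plain Java, not the actual Kafka client API) of a partition as an append-only log: the only random access is by offset, so a query like "all orders for user X" inevitably becomes a forward scan with a filter.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Illustrative model of one Kafka partition: an append-only log
// addressable only by offset. There is no secondary index.
class PartitionLog {
    private final List<String> records = new ArrayList<>();

    // Appending returns the record's offset (like a producer ack).
    int append(String record) {
        records.add(record);
        return records.size() - 1;
    }

    // The only random access available: reposition by offset (cf. #seek()).
    String readAt(int offset) {
        return records.get(offset);
    }

    // Any attribute-based query degenerates into a linear scan from an offset.
    List<String> scanFrom(int offset, Predicate<String> filter) {
        List<String> hits = new ArrayList<>();
        for (int i = offset; i < records.size(); i++) {
            if (filter.test(records.get(i))) {
                hits.add(records.get(i));
            }
        }
        return hits;
    }
}
```

This is why "search orders by user/date/state" cannot be answered by the broker itself: the broker only serves sequential reads per partition.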

However, you can use Kafka's Streams API, which allows you to build an app that holds the current state -- based on a Kafka topic that is the ground truth -- as a materialized view (basically a cache). "Interactive Queries" then lets you query this materialized view.
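Conceptually, the Streams app replays the key-value events from the topic into a local state store and answers lookups from that store, not from the broker. A minimal sketch of that idea (the class and method names here are illustrative, not the real Streams API; in Kafka Streams you would build a KTable and query its state store via Interactive Queries):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative materialized view: each consumed event upserts the latest
// state per key, like a KTable built from the orders topic.
class OrderView {
    private final Map<String, String> store = new HashMap<>();

    // Called for every record consumed from the topic (the ground truth).
    void apply(String orderId, String state) {
        store.put(orderId, state);
    }

    // The "interactive query" side: a point lookup against the local view.
    String stateOf(String orderId) {
        return store.get(orderId);
    }
}
```

Because the view is just derived state, it can always be rebuilt by replaying the topic from the beginning, and you can maintain additional views keyed by user or date to serve those queries.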

For more details, see these two blog posts:

Matthias J. Sax