
This is a follow-up to several other SO questions regarding the use of spring-batch and spring-kafka.

Intent:

The intent is to set up a call chain like this (simplified view):

Master invokes slave step:

<master job> -> partitioner (MessageChannelPartitionHandler) +aggregator -> messagingTemplate -> outbound-requests (Channel) -> request-outbound-staging (KafkaProducerMessageHandler) -> kafka
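
For reference, here is a minimal sketch of what this outbound leg is meant to look like in Java config. The class name, the kafkaTemplate bean, the step name and the grid size are placeholders, not the actual wiring:

import org.springframework.batch.integration.partition.MessageChannelPartitionHandler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.core.MessagingTemplate;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.messaging.MessageHandler;
import org.springframework.messaging.PollableChannel;

@Configuration
public class MasterOutboundSketch {

    // channel the partition handler writes StepExecutionRequests to
    @Bean(name = "outbound-requests")
    public DirectChannel outboundRequests() {
        return new DirectChannel();
    }

    // forwards anything arriving on outbound-requests to the job.step topic
    @Bean
    @ServiceActivator(inputChannel = "outbound-requests")
    public MessageHandler requestOutboundStaging(KafkaTemplate<Integer, Object> kafkaTemplate) {
        KafkaProducerMessageHandler<Integer, Object> handler =
                new KafkaProducerMessageHandler<>(kafkaTemplate);
        handler.setTopicExpression(new LiteralExpression("job.step"));
        return handler;
    }

    // sends partition requests through the template and polls inbound-replies for results
    @Bean
    public MessageChannelPartitionHandler partitionHandler(PollableChannel inboundReplies) {
        MessagingTemplate template = new MessagingTemplate();
        template.setDefaultChannel(outboundRequests());
        MessageChannelPartitionHandler handler = new MessageChannelPartitionHandler();
        handler.setMessagingOperations(template);
        handler.setStepName("workerStep"); // placeholder step name
        handler.setGridSize(4);            // placeholder grid size
        handler.setReplyChannel(inboundReplies);
        return handler;
    }
}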

Kafka listener responds to message and fires slave worker step

kafka -> inbound-request-listener (MessageDrivenChannelAdapter) -> inbound-requests (channel) -> worker-container (KafkaMessageListenerContainer) -> stepExecutionRequestHandler <slave step>
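
A minimal sketch of the worker-side handler that is supposed to fire, following the standard remote-partitioning wiring (class and channel names are placeholders, and a jobExplorer bean is assumed to be available):

import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.integration.partition.BeanFactoryStepLocator;
import org.springframework.batch.integration.partition.StepExecutionRequestHandler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.ServiceActivator;

@Configuration
public class WorkerInboundSketch {

    // resolves the slave step bean by name from the application context
    @Bean
    public BeanFactoryStepLocator stepLocator() {
        return new BeanFactoryStepLocator();
    }

    // consumes StepExecutionRequest messages from inbound-requests, runs the slave step,
    // and sends the resulting StepExecution to outbound-replies
    @Bean
    @ServiceActivator(inputChannel = "inbound-requests", outputChannel = "outbound-replies")
    public StepExecutionRequestHandler stepExecutionRequestHandler(JobExplorer jobExplorer) {
        StepExecutionRequestHandler handler = new StepExecutionRequestHandler();
        handler.setJobExplorer(jobExplorer);
        handler.setStepLocator(stepLocator());
        return handler;
    }
}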

Spring batch replies are returned to Kafka

stepExecutionRequestHandler <slave step> -> stepMessagingTemplate -> outbound-replies (Channel) -> reply-outbound-staging (KafkaProducerMessageHandler) -> kafka
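
A sketch of this reply leg, mirroring the request producer (topic, class and bean names are assumptions):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.messaging.MessageHandler;

@Configuration
public class WorkerReplySketch {

    // channel the StepExecutionRequestHandler replies are routed to
    @Bean(name = "outbound-replies")
    public DirectChannel outboundReplies() {
        return new DirectChannel();
    }

    // publishes each StepExecution reply to the job.step.reply topic
    @Bean
    @ServiceActivator(inputChannel = "outbound-replies")
    public MessageHandler replyOutboundStaging(KafkaTemplate<Integer, Object> kafkaTemplate) {
        KafkaProducerMessageHandler<Integer, Object> handler =
                new KafkaProducerMessageHandler<>(kafkaTemplate);
        handler.setTopicExpression(new LiteralExpression("job.step.reply"));
        return handler;
    }
}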

Kafka listener returns replies to aggregator and partitioner

kafka -> inbound-replies (MessageDrivenChannelAdapter) -> partitioner (MessageChannelPartitionHandler) +aggregator -> <master job>
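
And a sketch of the reply listener feeding the aggregator; replyContainer here is a placeholder for a second KafkaMessageListenerContainer subscribed to job.step.reply:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.channel.QueueChannel;
import org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;

@Configuration
public class MasterReplySketch {

    // pollable channel the MessageChannelPartitionHandler aggregates replies from
    @Bean(name = "inbound-replies")
    public QueueChannel inboundReplies() {
        return new QueueChannel();
    }

    // record-mode adapter moving replies from Kafka onto inbound-replies
    @Bean
    public KafkaMessageDrivenChannelAdapter<Integer, Object> inboundKafkaReplies(
            KafkaMessageListenerContainer<Integer, Object> replyContainer) {
        KafkaMessageDrivenChannelAdapter<Integer, Object> adapter =
                new KafkaMessageDrivenChannelAdapter<>(replyContainer,
                        KafkaMessageDrivenChannelAdapter.ListenerMode.record);
        adapter.setOutputChannel(inboundReplies());
        return adapter;
    }
}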

History:

After initially working out the configuration of spring-integration with Kafka, the spring-batch components on the slave side of the process were not finding the steps when the listener was kicked off.

We refactored the spring-batch components and the spring-integration configuration that drives them, ultimately moving them and the listener components out of the Java DSL and into the slave step XML.

Current status:

After the refactoring, the Kafka listener no longer appears to be responding. The only symptoms are that there are no responses from the slave process and that the aggregator times out.

Java DSL config:

@Configuration
@Order(6)
@EnableIntegration
@EnableKafka
@IntegrationComponentScan
public class QueueingConfig {
    private static final int MXMODULE = 400;
    private static final String JOB_CONTROL_TOPIC = "job.control";
    private static final String STEP_EXECUTION_TOPICS = "job.step";
    private static final String STEP_REPLY_TOPICS = "job.step.reply";

    // ... remaining bean definitions omitted ...
}

XML config snips:

<bean id="worker-container" class="org.springframework.kafka.listener.KafkaMessageListenerContainer">
    <constructor-arg>
        <bean class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
            <constructor-arg>
                <map>
                    <entry key="bootstrap.servers" value="192.168.2.127:9092" />  <!-- needs to come from factory bean -->
                    <entry key="key.deserializer" value="org.apache.kafka.common.serialization.IntegerDeserializer"/>
                    <entry key="value.deserializer" value="org.springframework.kafka.support.serializer.JsonDeserializer"/>
                    <entry key="group.id" value="batch"/>
                    <entry key="spring.json.trusted.packages" value="com.mypackage,org.springframework.batch.integration.partition"/>
                    <entry key="max.poll.records" value="10"/>
                </map>
            </constructor-arg>
        </bean>
    </constructor-arg>
    <constructor-arg>
        <bean class="org.springframework.kafka.listener.ContainerProperties">
            <constructor-arg name="topics" value="job.step" />
        </bean>
    </constructor-arg>
</bean>


<int-kafka:message-driven-channel-adapter
    id="inboundKafkaRequests"
    send-timeout="5000"
    mode="record"
    channel="inbound-requests"
    auto-startup="true"
    listener-container="worker-container" 
    />

Prior Research:

  1. Spring Integration Kafka Consumer Listener not Receiving messages

  2. How do I convert this spring-integration configuration from XML to Java?

  3. kafka broker not available at starting


Edit: Update

During a Linux patch update, the Kafka config file was overwritten with a default file. I restored the correct configuration and Kafka resumed operation.

In the exercise of writing this question I formalized much of the Spring wiring. The process of formalization helped identify some issues with unwired integration components.

  • I'm not familiar with Spring Batch, but you definitely need to add that tag to the question. This is really more related to Spring Batch, not Spring Integration. Also would be great to have some simple project from you to play with on our side. Looks like it is pretty complicated configuration, so would be great to have everything on one plate. Your single `inboundKafkaRequests` configuration doesn't bring too much value currently. Thank you for understanding. – Artem Bilan Apr 30 '19 at 15:46
  • You should turn on DEBUG logging to watch the message flow. You should also add `enable.auto.commit=false` and `auto.offset.reset=earliest` to the consumer properties. – Gary Russell Apr 30 '19 at 16:18
  • Thank you both. See edits to question - I'm on to the next problem now. – pojo-guy Apr 30 '19 at 18:13

0 Answers