This is a follow-up to several other SO questions regarding the use of spring-batch and spring-kafka.
Intent:
The intent is to set up a call chain like this (simplified view; a rough Java sketch of the master-side wiring follows the chains):
Master invokes slave step:
<master job> -> partitioner (MessageChannelPartitionHandler) +aggregator -> messagingTemplate -> outbound-requests (Channel) -> request-outbound-staging (KafkaProducerMessageHandler) -> kafka
Kafka listener responds to message and fires slave worker step
kafka -> inbound-request-listener (MessageDrivenChannelAdapter) -> inbound-requests (channel) -> worker-container (KafkaMessageListenerContainer) -> stepExecutionRequestHandler <slave step>
Spring batch replies are returned to Kafka
stepExecutionRequestHandler <slave step> -> stepMessagingTemplate -> outbound-replies (Channel) -> reply-outbound-staging (KafkaProducerMessageHandler) -> kafka
Kafka listener returns replies to aggregator and partitioner
kafka -> inbound-replies (MessageDrivenChannelAdapter) -> partitioner (MessageChannelPartitionHandler) +aggregator -> <master job>
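For orientation, here is a minimal sketch of how the master side of these chains could be wired with the Java DSL. The channel and topic names come from the chains above; the kafkaTemplate/consumerFactory beans, the step name, and the grid size are assumptions, and the aggregator that correlates worker replies is only noted in a comment, so treat this as a sketch rather than the actual configuration.

import org.springframework.batch.core.partition.PartitionHandler;
import org.springframework.batch.integration.partition.MessageChannelPartitionHandler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.channel.QueueChannel;
import org.springframework.integration.core.MessagingTemplate;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;

@Configuration
public class MasterSideSketch {

    // outbound-requests: channel the partition handler sends StepExecutionRequests to
    @Bean
    public DirectChannel outboundRequests() {
        return new DirectChannel();
    }

    // inbound-replies: pollable channel the partition handler waits on for replies
    @Bean
    public QueueChannel inboundReplies() {
        return new QueueChannel();
    }

    // outbound-requests -> KafkaProducerMessageHandler -> kafka topic job.step
    @Bean
    public IntegrationFlow requestOutboundStaging(KafkaTemplate<Integer, Object> kafkaTemplate) {
        return IntegrationFlows.from(outboundRequests())
                .handle(Kafka.outboundChannelAdapter(kafkaTemplate).topic("job.step"))
                .get();
    }

    // kafka topic job.step.reply -> inbound-replies
    // (an aggregator correlating the worker replies into one message per job execution
    //  normally sits between the adapter and the reply channel; omitted here)
    @Bean
    public IntegrationFlow replyInbound(ConsumerFactory<Integer, Object> consumerFactory) {
        return IntegrationFlows
                .from(Kafka.messageDrivenChannelAdapter(consumerFactory, "job.step.reply"))
                .channel(inboundReplies())
                .get();
    }

    // partitioner side: MessageChannelPartitionHandler sends requests and waits for replies
    @Bean
    public PartitionHandler partitionHandler() {
        MessageChannelPartitionHandler handler = new MessageChannelPartitionHandler();
        handler.setMessagingOperations(new MessagingTemplate(outboundRequests()));
        handler.setReplyChannel(inboundReplies());
        handler.setStepName("workerStep"); // assumed slave step name
        handler.setGridSize(4);            // assumed partition count
        return handler;
    }
}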
History:
After initially working out the configuration of spring-integration with kafka, the spring-batch components on the slave side of the process were not finding the steps when the listener was kicked off.
We refactored the spring-batch components and the spring-integration components that drive them, ultimately moving them and the listener components out of the Java DSL and into the slave step XML.
Current status:
After the refactoring, the Kafka listener no longer appears to be responding. The only symptom is that there are no responses from the slave process, and the aggregator times out.
Java DSL config:
@Configuration
@Order(6)
@EnableIntegration
@EnableKafka
@IntegrationComponentScan
public class QueueingConfig {
    private static final int MXMODULE = 400;
    private static final String JOB_CONTROL_TOPIC = "job.control";
    private static final String STEP_EXECUTION_TOPICS = "job.step";
    private static final String STEP_REPLY_TOPICS = "job.step.reply";
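The class shown here stops at the topic constants; the producer-side beans it needs are not shown anywhere in the question, so here is a hedged sketch of what they might look like. The broker address and Integer-key/JSON-value serialization mirror the consumer properties hard-coded in the XML below; everything else (class and bean names) is an assumption, and these beans could equally live inside QueueingConfig.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.IntegerSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class KafkaProducerSketch {

    // Broker address and Integer key / JSON value serialization mirror the
    // consumer-side values hard-coded in the XML below; all of it is assumed.
    @Bean
    public ProducerFactory<Integer, Object> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.2.127:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    // KafkaTemplate used by the KafkaProducerMessageHandler outbound adapters
    @Bean
    public KafkaTemplate<Integer, Object> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}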
XML config snips:
<bean id="worker-container" class="org.springframework.kafka.listener.KafkaMessageListenerContainer">
<constructor-arg>
<bean class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
<constructor-arg>
<map>
<entry key="bootstrap.servers" value="192.168.2.127:9092" /> <!-- needs to come from factory bean -->
<entry key="key.deserializer" value="org.apache.kafka.common.serialization.IntegerDeserializer"/>
<entry key="value.deserializer" value="org.springframework.kafka.support.serializer.JsonDeserializer"/>
<entry key="group.id" value="batch"/>
<entry key="spring.json.trusted.packages" value="com.mypackage,org.springframework.batch.integration.partition"/>
<entry key="max.poll.records" value="10"/>
</map>
</constructor-arg>
</bean>
</constructor-arg>
<constructor-arg>
<bean class="org.springframework.kafka.listener.ContainerProperties">
<constructor-arg name="topics" value="job.step" />
</bean>
</constructor-arg>
</bean>
<int-kafka:message-driven-channel-adapter
        id="inboundKafkaRequests"
        send-timeout="5000"
        mode="record"
        channel="inbound-requests"
        auto-startup="true"
        listener-container="worker-container"
        />
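The XML above ends at the inbound-requests channel; nothing in the snippets shows the consumer of that channel or the reply path back to Kafka. As a point of comparison, a rough Java DSL sketch of that remaining worker-side wiring might look like the following. The jobExplorer and stepLocator beans, and the use of a single flow instead of separate channels and adapters, are assumptions rather than the actual configuration.

import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.step.StepLocator;
import org.springframework.batch.integration.partition.StepExecutionRequestHandler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;

@Configuration
public class WorkerSideSketch {

    // inbound-requests -> stepExecutionRequestHandler -> outbound-replies -> kafka topic job.step.reply
    @Bean
    public IntegrationFlow workerFlow(ConsumerFactory<Integer, Object> consumerFactory,
                                      KafkaTemplate<Integer, Object> kafkaTemplate,
                                      StepExecutionRequestHandler stepExecutionRequestHandler) {
        return IntegrationFlows
                .from(Kafka.messageDrivenChannelAdapter(consumerFactory, "job.step"))
                .channel("inbound-requests")
                .handle(stepExecutionRequestHandler, "handle")
                .channel("outbound-replies")
                .handle(Kafka.outboundChannelAdapter(kafkaTemplate).topic("job.step.reply"))
                .get();
    }

    // Locates the slave step by name and executes it for each StepExecutionRequest
    @Bean
    public StepExecutionRequestHandler stepExecutionRequestHandler(JobExplorer jobExplorer,
                                                                   StepLocator stepLocator) {
        StepExecutionRequestHandler handler = new StepExecutionRequestHandler();
        handler.setJobExplorer(jobExplorer);
        handler.setStepLocator(stepLocator);
        return handler;
    }
}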
Prior Research:
Spring Integration Kafka Consumer Listener not Receiving messages
How do I convert this spring-integration configuration from XML to Java?
Edit: Update
During a linux patch update, the kafka config file was overwritten with a default file. I restored the correct configuration and kafka resumed operation.
In the exercise of writing this question, I formalized much of the Spring wiring. The process of formalization helped to identify some issues with unwired integration components.