My Spring Batch process reads millions of rows from a DB in the read step and publishes them to a Kafka topic in the write step.
At the company where I work we use a wrapper around Kafka, and I'm currently calling a sendAsync() method that takes an event supplier for the message and a callback (org.apache.kafka.clients.producer.Callback).
My question: since this is a batch job, I'm afraid the job execution will end before I get an indication about all the messages sent asynchronously (for example, exceptions might have been thrown for the last messages).
Is there a Spring Batch supported way to ensure step completion, or should I just check Future.isDone() in the write step? I saw in the Spring Batch documentation that there is a KafkaItemWriter; should I use it in my case? I'm currently implementing ItemWriter. Thanks.
CodePudding user response:
The KafkaItemWriter provided by Spring Batch waits for the asynchronous sends to complete (up to a configurable timeout) before the write operation returns. This ensures that each write has either succeeded or failed before the chunk, and therefore the step, completes.
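For reference, a minimal sketch of wiring up the built-in KafkaItemWriter through its builder could look like the following. The item type MyRow, the topic name, the key mapper, and the timeout value are placeholders, and the timeout option is only available in more recent Spring Batch versions, so check the builder methods against your Spring Batch / Spring Kafka versions:

```java
import java.time.Duration;

import org.springframework.batch.item.kafka.KafkaItemWriter;
import org.springframework.batch.item.kafka.builder.KafkaItemWriterBuilder;
import org.springframework.kafka.core.KafkaTemplate;

public class WriterConfig {

    // Assumes a KafkaTemplate<String, MyRow> is configured elsewhere and that
    // MyRow (placeholder type) exposes an id usable as the record key.
    public KafkaItemWriter<String, MyRow> kafkaItemWriter(KafkaTemplate<String, MyRow> template) {
        // KafkaItemWriter sends to the template's default topic, so it must be set.
        template.setDefaultTopic("my-topic"); // placeholder topic name

        return new KafkaItemWriterBuilder<String, MyRow>()
                .kafkaTemplate(template)
                .itemKeyMapper(row -> String.valueOf(row.getId())) // map each item to its Kafka key
                .timeout(Duration.ofSeconds(30)) // wait up to 30s per send when flushing the chunk
                .build();
    }
}
```

With this setup, the writer collects the futures returned for each item in the chunk and blocks on them when the chunk is flushed, so a failed or timed-out send fails the chunk rather than being silently lost.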
If you are writing a custom ItemWriter, you can take inspiration from that approach.
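If you stay with your custom ItemWriter around the corporate wrapper, the same idea applies: track the outcome of every asynchronous send in the chunk and block on it before write() returns, so that a failed send fails the chunk (and the step) instead of going unnoticed. Here is a rough sketch under some assumptions: the KafkaWrapper type and its sendAsync(Supplier, Callback) signature are hypothetical stand-ins for your corporate API, MyRow is a placeholder item type, and the write() signature shown is the Spring Batch 4 List-based one (Spring Batch 5 takes a Chunk instead):

```java
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.producer.RecordMetadata;
import org.springframework.batch.item.ItemWriter;

public class AsyncKafkaItemWriter implements ItemWriter<MyRow> {

    // Hypothetical corporate wrapper; adjust to the real sendAsync() signature.
    private final KafkaWrapper kafkaWrapper;
    private final long sendTimeoutMs = Duration.ofSeconds(30).toMillis();

    public AsyncKafkaItemWriter(KafkaWrapper kafkaWrapper) {
        this.kafkaWrapper = kafkaWrapper;
    }

    @Override
    public void write(List<? extends MyRow> items) throws Exception {
        List<CompletableFuture<RecordMetadata>> futures = new ArrayList<>();

        // Fire all sends for the chunk asynchronously, completing one future
        // per item from the producer callback.
        for (MyRow item : items) {
            CompletableFuture<RecordMetadata> future = new CompletableFuture<>();
            futures.add(future);
            kafkaWrapper.sendAsync(() -> item, (metadata, exception) -> {
                if (exception != null) {
                    future.completeExceptionally(exception);
                } else {
                    future.complete(metadata);
                }
            });
        }

        // Block until every send in this chunk is acknowledged or failed.
        // A send failure (or timeout) throws here and fails the chunk, so the
        // step cannot complete with unconfirmed messages.
        for (CompletableFuture<RecordMetadata> future : futures) {
            future.get(sendTimeoutMs, TimeUnit.MILLISECONDS);
        }
    }
}
```

Completing your own CompletableFuture inside the callback keeps the writer independent of whether the wrapper's sendAsync() returns a Future; if it does return one, you can collect and wait on those directly instead.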