Spring micro-services: Kafka event processing issue


I'm new to microservices and need some suggestions on how to address the issues below.

I have two microservices, an order microservice and a delivery microservice. The order microservice pushes events to the delivery microservice through Kafka.

I need some assistance with:

  1. How do I track a particular event in both microservices? This will help with logging and tracking event changes.

    For this, can I generate a random number and add it to the payload to be used for tracking?

  2. Let's assume the order microservice has 10 orders, all ten have been processed, and 10 events are generated and pushed to Kafka. If there is a failure, how should I handle it?

    I thought of creating an error queue for the order service: if there are any errors while processing events on the order microservice side, the events can be pushed there. The user can then correct the issue and retry only those events.

  3. At the delivery microservice, how do I ensure that a particular event is not processed more than once?

  4. Also, if there are any errors while processing events on the delivery microservice side, how should they be handled?

    The same as in the second point?

Are there any more scenarios I need to consider?

CodePudding user response:

  1. How do I track a particular event in both microservices? This will help with logging and tracking event changes.

For that, what is usually used in production environments is a traceId (correlation ID): generate it once when the event is created, propagate it with the message, and log it in every service that handles the event.
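
A minimal sketch of that with Spring Kafka, assuming an `orders` topic and plain string payloads (both made up for illustration): the producer attaches a `traceId` header to every record, and the consumer reads it back and logs it, so the same event can be correlated across both services without touching the payload.

```java
import java.nio.charset.StandardCharsets;
import java.util.UUID;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.Header;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;

public class TraceIdExample {

    private static final Logger log = LoggerFactory.getLogger(TraceIdExample.class);

    private final KafkaTemplate<String, String> kafkaTemplate;

    public TraceIdExample(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Order side: generate one traceId per event and send it as a header,
    // so the payload itself stays unchanged.
    public void publish(String orderId, String payload) {
        String traceId = UUID.randomUUID().toString();
        ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", orderId, payload);
        record.headers().add("traceId", traceId.getBytes(StandardCharsets.UTF_8));
        log.info("traceId={} publishing event for order {}", traceId, orderId);
        kafkaTemplate.send(record);
    }

    // Delivery side: read the header back and log it on every message, so the
    // two services' logs can be joined on the same traceId.
    @KafkaListener(topics = "orders")
    public void consume(ConsumerRecord<String, String> record) {
        Header header = record.headers().lastHeader("traceId");
        String traceId = header == null ? "unknown"
                : new String(header.value(), StandardCharsets.UTF_8);
        log.info("traceId={} processing event for order {}", traceId, record.key());
    }
}
```

A random number in the payload works too, but a UUID avoids collisions and a header keeps the payload clean. Spring Cloud Sleuth (now Micrometer Tracing) can also generate and propagate such IDs over Kafka automatically.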

  2. Let us assume the order microservice has 10 orders, all ten have been processed, and 10 events are generated and pushed to Kafka. If there is a failure, how should I handle it?

Generally what is implemented is a retry template (if you are using Spring Boot; otherwise something similar in whichever framework you are working with). It retries the operation a set number of times, which absorbs transient errors. If the error still persists, there are multiple possibilities: you could log appropriately and reprocess from that offset, or, in a less critical environment, you could save the event to a file and build an admin REST endpoint that accepts the event and processes it the same way you process the Kafka events.
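
A minimal sketch of that in spring-kafka 2.8+, assuming Spring Boot's auto-configured listener container (which picks up a `CommonErrorHandler` bean): each failed record is retried a few times with a fixed backoff, and once the retries are exhausted it is published to a dead-letter topic (`<original-topic>.DLT` by default), which is essentially the "error queue" described in the question and can be inspected and replayed later.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.listener.DefaultErrorHandler;
import org.springframework.util.backoff.FixedBackOff;

@Configuration
public class KafkaErrorHandlingConfig {

    // Retry a failing record up to 3 more times, 1 second apart; if it still
    // fails, publish it to "<original-topic>.DLT" instead of blocking the
    // partition.
    @Bean
    public DefaultErrorHandler errorHandler(KafkaTemplate<Object, Object> template) {
        return new DefaultErrorHandler(
                new DeadLetterPublishingRecoverer(template),
                new FixedBackOff(1000L, 3L));
    }
}
```

If you prefer the plain `RetryTemplate` from spring-retry, the pattern is the same: wrap the processing call in `retryTemplate.execute(...)` and route the event to your error store in the recovery callback.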

  3. At the delivery microservice, how do I ensure that a particular event is not processed more than once?

This depends on the logic you perform per Kafka message. If you are creating new data (inserting rows) in your database, you can simply check for uniqueness based on the primary key (which in this case would be the order ID or something similar) and refuse to add the duplicate data, logging it with a warning, as sketched below. Otherwise, a common way to do this is to overwrite the previously inserted data. Both ways should be about as performant as processing always-unique records, provided you leave the uniqueness check to the database and handle the exception properly.
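
As a minimal sketch of the first option, assuming the delivery service inserts one row per order with `order_id` as the primary key (the table and column names are made up): a replayed event violates the key constraint, and the listener logs a warning instead of processing it a second time.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class DeliveryListener {

    private static final Logger log = LoggerFactory.getLogger(DeliveryListener.class);

    private final JdbcTemplate jdbcTemplate;

    public DeliveryListener(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @KafkaListener(topics = "orders")
    public void onOrderEvent(String orderId) {
        try {
            // order_id is the primary key, so a duplicate event fails the
            // insert instead of creating a second delivery.
            jdbcTemplate.update(
                    "INSERT INTO delivery (order_id, status) VALUES (?, 'PENDING')",
                    orderId);
        } catch (DuplicateKeyException e) {
            log.warn("Event for order {} already processed, skipping", orderId);
        }
    }
}
```

For the overwrite variant, replace the plain `INSERT` with an upsert (for example `INSERT ... ON CONFLICT (order_id) DO UPDATE` on PostgreSQL) so a replayed event simply rewrites the same row.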

Another workaround for uniqueness is on the producer side: Kafka's idempotent producer ensures that retries by the producer itself do not write duplicate records to the topic.
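
One caveat on scope: the idempotent producer only de-duplicates retries performed by the producer itself (for example after a network error); it does not protect against the application sending the same logical event twice. A minimal sketch of enabling it (standard Kafka producer properties, wired into a Spring Kafka template):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

public class IdempotentProducerConfig {

    public KafkaTemplate<String, String> kafkaTemplate() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // The broker de-duplicates producer-side retries; this requires acks=all.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        ProducerFactory<String, String> factory =
                new DefaultKafkaProducerFactory<>(props);
        return new KafkaTemplate<>(factory);
    }
}
```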

  4. Also, if there are any errors while processing events on the delivery microservice side, how should they be handled?

See point 2: the same retry-plus-dead-letter approach applies on the delivery side.
