kafka streams in golang


I'm trying to create a Kafka Streams client in Go. From what I've seen, this is only possible with the Java client. I did a bit of searching and found a few third-party libraries, but nothing official. Also, from my limited understanding, I think Streams is syntactic sugar over the standard consumers. Is this correct?

CodePudding user response:

To answer this particular question:

Also from my limited understanding I think streams is syntactic sugar over the standard consumers ? is this correct?

When implementing asynchronous microservices, you could use the producer and consumer APIs directly, but they are quite low level: good for learning how Kafka works, but a lot of effort for more complex applications. Event-driven applications often need multistage processing, where each stage reads events from Kafka, processes them, and writes to output topics; building that by hand on the producer and consumer APIs is a lot of work. Kafka Streams is one of the higher-level solutions: a very versatile library that supports both stateless and stateful stream processing.
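The multistage idea can be illustrated without Kafka at all. In this minimal sketch, Go channels stand in for topics and each goroutine stands in for a processing stage; the `source` and `process` names are hypothetical, not part of any Kafka library:

```go
package main

import (
	"fmt"
	"strings"
)

// source simulates stage 1: reading events from an input topic.
func source(events []string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for _, e := range events {
			out <- e
		}
	}()
	return out
}

// process simulates stage 2: a stateless transformation step.
func process(in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for e := range in {
			out <- strings.ToUpper(e)
		}
	}()
	return out
}

func main() {
	// Wire the stages together; with real Kafka, each arrow
	// between stages would be a topic instead of a channel.
	for e := range process(source([]string{"a", "b"})) {
		fmt.Println(e)
	}
}
```

With real Kafka, each stage would additionally handle offsets, partitions, and failures, which is exactly the bookkeeping Kafka Streams takes off your hands.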

Note: if you have the option of working with a language other than Go, I would highly recommend Java for Kafka Streams. We have been working with Kafka Streams in Java for the last two years, and in our experience it is more of a Java library than a language-agnostic distributed system.

CodePudding user response:

To write a Kafka Streams-style client in Go, you will need standard consumers as well as producers: consumers to source data from Kafka, and producers to sink data back into Kafka topics. On top of those, there are several streaming concepts to implement for handling consumed messages, such as process, filter, and transform.

Kafka Streams also has the concept of state stores, so you will need an in-memory store backend as well as pluggable store backends such as BadgerDB, MongoDB, or RocksDB.
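The simplest state store is a thread-safe map behind a small interface, which is what makes backends pluggable. This `Store` interface and `MemStore` are a hypothetical sketch, not taken from any particular library:

```go
package main

import (
	"fmt"
	"sync"
)

// Store is a minimal state-store interface; real libraries add
// iteration, serialization, and pluggable backends (Badger,
// RocksDB, MongoDB, ...) behind a similar abstraction.
type Store interface {
	Set(key, value string)
	Get(key string) (string, bool)
}

// MemStore is a thread-safe in-memory backend.
type MemStore struct {
	mu   sync.RWMutex
	data map[string]string
}

func NewMemStore() *MemStore {
	return &MemStore{data: make(map[string]string)}
}

func (s *MemStore) Set(key, value string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.data[key] = value
}

func (s *MemStore) Get(key string) (string, bool) {
	s.mu.RLock()
	defer s.mu.RUnlock()
	v, ok := s.data[key]
	return v, ok
}

func main() {
	var st Store = NewMemStore()
	st.Set("count:user1", "3")
	fmt.Println(st.Get("count:user1"))
}
```

Swapping the in-memory map for a persistent backend only requires another type satisfying `Store`.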

In addition, there should be a mechanism for failover handling: some way to recover local state after a crash or rebalance.

Note - As a Go developer for the last 4 years, I have used the Go Kafka stream library tryfix/kstream for production-level implementations and have contributed improvements and fixes to it. You can get a good understanding by going through the repository.
