How to configure Kafka environment from different frameworks of Js and Go to ship JSON


I am building a couple of micro-services, along with a logging micro-service.

The logging micro-service runs after the other micro-services have sent their logs to Kafka. It has to consume all the log data sent by the JS-based micro-services, and I have to receive that JSON in Go.
Is there any way to do this without using a parser? (Similar to how gRPC converts all data to binary for faster transport, in a form every environment understands.)

I have very little knowledge of how different environments work together when using a message broker.

CodePudding user response:

You can use schemas (Protobuf, Avro, JSON Schema), so the data is converted to binary. To get the full benefit, though, you will need a schema registry; otherwise the schema gets embedded in every message.
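As a minimal sketch of the schema approach, here is Avro in Go using github.com/linkedin/goavro/v2. The LogEntry schema and its field names are assumptions for illustration; note that without a registry, both the JS producers and the Go consumer have to carry this schema themselves:

```go
package main

import (
	"fmt"

	"github.com/linkedin/goavro/v2"
)

func main() {
	// Hypothetical log-entry schema shared by the JS producers and the Go consumer.
	codec, err := goavro.NewCodec(`{
		"type": "record",
		"name": "LogEntry",
		"fields": [
			{"name": "service", "type": "string"},
			{"name": "level",   "type": "string"},
			{"name": "message", "type": "string"}
		]
	}`)
	if err != nil {
		panic(err)
	}

	// Encode a native Go map to compact Avro binary (what a producer would send to Kafka).
	binary, err := codec.BinaryFromNative(nil, map[string]interface{}{
		"service": "auth",
		"level":   "ERROR",
		"message": "token expired",
	})
	if err != nil {
		panic(err)
	}

	// Decode the binary back to a native value (what the logging consumer would do).
	native, _, err := codec.NativeFromBinary(binary)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%d bytes on the wire, decoded: %v\n", len(binary), native)
}
```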

CodePudding user response:

I'm not sure I fully understand what you're asking.

Kafka stores bytes, and there are client libraries for both JS and Go. Both languages/environments need libraries for serialization and deserialization. As for Go, you don't need a struct to read JSON; you can unmarshal into a generic map.
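For example, here is a minimal sketch of a Go consumer that reads the JS services' JSON without defining a struct, using github.com/segmentio/kafka-go. The broker address, topic, group ID, and field names are assumptions for illustration:

```go
package main

import (
	"context"
	"encoding/json"
	"fmt"

	"github.com/segmentio/kafka-go"
)

func main() {
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"}, // assumed broker address
		Topic:   "service-logs",             // assumed topic name
		GroupID: "logging-service",          // assumed consumer group
	})
	defer r.Close()

	for {
		msg, err := r.ReadMessage(context.Background())
		if err != nil {
			break
		}
		// Decode into a generic map -- no struct required.
		var logEvent map[string]interface{}
		if err := json.Unmarshal(msg.Value, &logEvent); err != nil {
			fmt.Println("skipping malformed message:", err)
			continue
		}
		// "service" and "message" are assumed field names in the JS producers' payload.
		fmt.Printf("log from %v: %v\n", logEvent["service"], logEvent["message"])
	}
}
```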

Alternatively, you can use tools like Elasticsearch or Splunk to consume Kafka events and index the fields after deserializing the records, without needing any Go consumer.
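As a sketch, the Elasticsearch route can be wired up with Kafka Connect's Elasticsearch sink connector rather than any hand-written consumer; the connector name, topic, and connection URL below are assumptions:

```json
{
  "name": "logs-es-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "service-logs",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false"
  }
}
```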

But yes, a parser will be needed somewhere to index your data into a searchable format.
