I'm writing here because I'm having problems deserialising a record type in Java when it is inside a union of two records, and I didn't find any similar post.
Has anyone of you faced this problem and/or can you help me find a solution?
The Avro schema we defined has an EventWrapper like this:
{
  "type": "record",
  "name": "EventWrapper",
  "namespace": "com.my.spec.avro.wrapper",
  "fields": [{
    "name": "delta",
    "type": "null",
    "default": null
  }, {
    "name": "data",
    "type": [
      "null",
      {
        "type": "record",
        "name": "PersonEventDto",
        "namespace": "com.my.spec.avro.entity",
        "fields": [
          .....
        ]
      }, {
        "type": "record",
        "name": "CustomerEventDto",
        "namespace": "com.my.spec.avro.entity",
        "fields": [
          .....
        ]
      }
    ],
    "default": null
  }]
}
The idea is to have delta and data, both of them nullable, and for the moment we defined two records for the data field: PersonEventDto and CustomerEventDto.
The problem is that the avro-maven-plugin maps this union field to Object in the generated Java class and doesn't keep the type anywhere:
private java.lang.Object data;
So when we deserialise using the EventWrapper class generated by the avro-maven-plugin, we can't get the data object with its real type (PersonEventDto or CustomerEventDto).
Has anyone faced this problem with records and unions? Is there a way to make Avro or the Maven plugin keep the types for serialization and deserialization?
Thank you in advance!
CodePudding user response:
The type of the union is definitely kept. It's maintained in the schema and in the serialized data.
In your example schema, I added a long id1 field to PersonEventDto and an id2 field to CustomerEventDto. The following test case works on the generated code (Avro 1.11.1):
@Test
public void testBasic() throws IOException {
  // both null
  EventWrapper ew = EventWrapper.newBuilder().setDelta(null).setData(null).build();
  EventWrapper roundTrip = EventWrapper.fromByteBuffer(ew.toByteBuffer());
  assertThat(roundTrip.getDelta(), nullValue());
  assertThat(roundTrip.getData(), nullValue());

  // With a person
  EventWrapper ewPerson = EventWrapper.newBuilder().setDelta(null).setData(
      PersonEventDto.newBuilder().setId1(123L).build()).build();
  roundTrip = EventWrapper.fromByteBuffer(ewPerson.toByteBuffer());
  assertThat(roundTrip.getDelta(), nullValue());
  assertThat(roundTrip.getData(), instanceOf(PersonEventDto.class));
  assertThat(roundTrip.getData(), hasProperty("id1", is(123L)));
  assertThat(((PersonEventDto) roundTrip.getData()).getId1(), is(123L));

  // With a customer event
  EventWrapper ewCustomer = EventWrapper.newBuilder().setDelta(null).setData(
      CustomerEventDto.newBuilder().setId2(234L).build()).build();
  roundTrip = EventWrapper.fromByteBuffer(ewCustomer.toByteBuffer());
  assertThat(roundTrip.getDelta(), nullValue());
  assertThat(roundTrip.getData(), instanceOf(CustomerEventDto.class));
  assertThat(roundTrip.getData(), hasProperty("id2", is(234L)));
}
Because data is a union of two different types, Object is the common parent used to represent it. You can still use Java's instanceof to distinguish between the two of them, and cast the deserialized value as appropriate.
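For example, a minimal sketch of dispatching on the union member might look like this (handlePerson and handleCustomer are hypothetical placeholders for whatever your application does with each type):

// Minimal sketch: branch on the concrete type held by the union field.
// handlePerson and handleCustomer are hypothetical handlers, not generated code.
static void dispatch(EventWrapper wrapper) {
  Object data = wrapper.getData();
  if (data instanceof PersonEventDto) {
    handlePerson((PersonEventDto) data);
  } else if (data instanceof CustomerEventDto) {
    handleCustomer((CustomerEventDto) data);
  } else {
    // data == null: the "null" branch of the union
  }
}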
CodePudding user response:
Solved! The problem was on the consumer side, not in the Avro schema!
I needed to add this configuration in the consumer's yml:
spring.cloud.stream.kafka.binder.consumer-properties.specific.avro.reader: true
Now it converts the payload to the specific type instead of a LinkedHashMap!! (I'm using spring-cloud-stream-binder-kafka, but I'm sure there is an equivalent with a regular Spring Kafka setup.)
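For a plain Kafka consumer using Confluent's KafkaAvroDeserializer, I believe the equivalent is the same specific.avro.reader property set directly on the consumer; something roughly like this (broker address, registry URL and group id are just placeholders):

// Rough sketch, assuming io.confluent.kafka.serializers.KafkaAvroDeserializer is on the classpath.
// The important line is the same specific.avro.reader flag set to true, so the deserializer
// returns the generated SpecificRecord classes instead of generic records / maps.
Properties props = new Properties();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder
props.put(ConsumerConfig.GROUP_ID_CONFIG, "event-wrapper-consumer");    // placeholder
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
props.put("schema.registry.url", "http://localhost:8081");              // placeholder
props.put("specific.avro.reader", "true");                              // the setting that matters
KafkaConsumer<String, EventWrapper> consumer = new KafkaConsumer<>(props);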
And now the deserializer resolves the data with its specific type, and I can cast the Object just by doing this:
CustomerEventDto customer = (CustomerEventDto) event.getPayload().getData();
Thanks Ryan Skraba!!! Your comment made me think outside the box and check the consumer instead of the avro-maven-plugin!