Hacker News

They removed required / optional in Protobuf version 3, and this rendered it useless for the Confluent Kafka schema registry at our company.

I read the explanation for this change (to be more tolerant of breaking changes), and while it may make sense in some cases, we could not rely on Protobuf 3 in an event-driven architecture with stricter data-consistency requirements. We went with Avro.
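For context, proto2 let a schema enforce field presence at the serialisation layer, which proto3 dropped. A minimal sketch of the difference (message and field names are hypothetical, shown as two separate .proto files):

```proto
// --- order_v2.proto ---
syntax = "proto2";
message OrderCreated {
  required string order_id = 1;  // serialisation fails if unset
  optional string note = 2;      // presence is tracked explicitly
}

// --- order_v3.proto ---
syntax = "proto3";
message OrderCreated {
  string order_id = 1;  // always "present": an unset field is
  string note = 2;      // indistinguishable from its default ("")
}
```

Proto3 later regained an explicit `optional` keyword for presence tracking (protoc 3.15+), but `required` is gone for good, so the schema itself can no longer guarantee a field was set.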




I've been looking at Avro for a while; how are you all feeling about it? Any suggestions or gotchas from real-world usage?


So far so good. Avro is the longest-supported serialisation format in Confluent Kafka; JSON and Protobuf were added recently. If you are on the JVM stack, the drivers for working with schemas are well supported. We use Python, and the Confluent driver lags behind if you want something advanced like Avro unions for the multiple-event-types-per-topic approach [1]; it is also missing auto-resolution of schemas for that scenario in the Avro deserialiser. It is not difficult to implement ourselves, but I would prefer not to.

[1] https://www.confluent.io/blog/multiple-event-types-in-the-sa...
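The multiple-event-types-per-topic approach boils down to registering a top-level union of named records as the topic's value schema. A minimal sketch of such an Avro schema (record and field names are hypothetical):

```json
[
  {
    "type": "record",
    "name": "OrderCreated",
    "namespace": "com.example.events",
    "fields": [
      {"name": "order_id", "type": "string"}
    ]
  },
  {
    "type": "record",
    "name": "OrderShipped",
    "namespace": "com.example.events",
    "fields": [
      {"name": "order_id", "type": "string"},
      {"name": "carrier", "type": ["null", "string"], "default": null}
    ]
  }
]
```

One topic can then carry either event type; on the consumer side the deserialiser has to work out which branch of the union a given message matches, which is the auto-resolution step mentioned above.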



