They removed the required/optional field modifiers in Protobuf version 3, and this rendered it useless for the Confluent Kafka schema registry at our company.
I read the explanation for this change (to be more flexible about breaking changes), and while it may make sense in some cases, we could not rely on Protobuf 3 in an event-driven architecture with stricter data-consistency requirements. We went with Avro.
So far so good. Avro is the longest-supported serialisation format in Confluent Kafka; JSON and Protobuf were added only recently. If you are on the JVM stack, the drivers for working with schemas are well supported. We use Python, and the Confluent driver lags behind if you want more advanced features, such as Avro unions for the multiple-event-types-per-topic approach [1]; its Avro deserialiser is also missing automatic schema resolution for that scenario. It would not be difficult to implement ourselves, but I would prefer not to.
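To make the union approach concrete, here is a hand-rolled sketch of how a top-level Avro union of event records looks on the wire: the writer prefixes the payload with the union branch index, and the reader uses that index to pick the right record schema. This is what "auto-resolution" has to do for you. Event names and fields are hypothetical, and the encoding here covers only what the example needs (long and string), not the full Avro spec.

```python
import io

EVENT_TYPES = ["OrderCreated", "OrderCancelled"]  # union branch order

def write_long(buf, n):
    """Avro long: zigzag, then variable-length base-128."""
    n = (n << 1) ^ (n >> 63)  # zigzag encoding
    while n > 0x7F:
        buf.write(bytes([(n & 0x7F) | 0x80]))
        n >>= 7
    buf.write(bytes([n]))

def write_string(buf, s):
    data = s.encode("utf-8")
    write_long(buf, len(data))  # Avro string: length-prefixed UTF-8
    buf.write(data)

def read_long(buf):
    shift, acc = 0, 0
    while True:
        b = buf.read(1)[0]
        acc |= (b & 0x7F) << shift
        shift += 7
        if not b & 0x80:
            break
    return (acc >> 1) ^ -(acc & 1)  # undo zigzag

def read_string(buf):
    return buf.read(read_long(buf)).decode("utf-8")

def serialize(event_type, order_id):
    # Both hypothetical records carry a single string field, order_id.
    buf = io.BytesIO()
    write_long(buf, EVENT_TYPES.index(event_type))  # union index first
    write_string(buf, order_id)                     # then the record fields
    return buf.getvalue()

def deserialize(payload):
    buf = io.BytesIO(payload)
    event_type = EVENT_TYPES[read_long(buf)]  # branch index -> event type
    return event_type, read_string(buf)

event_type, order_id = deserialize(serialize("OrderCancelled", "42"))
```

Note that in practice the Confluent serialisers also prepend a magic byte and a 4-byte schema-registry ID before the Avro payload; this sketch shows only the union encoding itself.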