Yeah, enriching the events as they pass through - so we have a Kafka Connect JDBC source polling the source of truth DB every X seconds for model changes (we looked into Debezium for streaming model changes, but the additional complexity didn't bring us much benefit based on how often our model changes), and then we pull that into a GlobalKTable and join the KStream as appropriate.
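Not the poster's actual code, but the enrichment topology they describe would look roughly like this in the Kafka Streams DSL - topic names (`models`, `events`, `events.enriched`) and the value types (`Model`, `Event`, `EnrichedEvent`) are hypothetical stand-ins:

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.GlobalKTable;
import org.apache.kafka.streams.kstream.KStream;

StreamsBuilder builder = new StreamsBuilder();

// Model rows landed here by the Kafka Connect JDBC source
GlobalKTable<String, Model> models = builder.globalTable("models");

KStream<String, Event> events = builder.stream("events");

// Join each event against the latest model snapshot; a GlobalKTable is
// fully replicated to every instance, so no co-partitioning is required
KStream<String, EnrichedEvent> enriched = events.join(
    models,
    (eventKey, event) -> event.getModelId(),        // extract the table key
    (event, model) -> new EnrichedEvent(event, model)
);

enriched.to("events.enriched");
```

The GlobalKTable (rather than a plain KTable) is what lets the event stream join on an arbitrary foreign key without repartitioning.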
We also use Kafka Streams with a persisted event window store to deduplicate across a 2-hour window.
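In Kafka Streams proper this would be a processor backed by a persistent `WindowStore`, but the core idea can be sketched with a self-contained, in-memory stand-in (class and method names are illustrative, not from the poster's codebase):

```java
import java.time.Duration;
import java.util.HashMap;
import java.util.Map;

// Minimal in-memory stand-in for a persisted window store:
// remember when each event ID was last seen, and treat a repeat
// arriving within the window as a duplicate.
class WindowedDeduplicator {
    private final long windowMs;
    private final Map<String, Long> lastSeen = new HashMap<>();

    WindowedDeduplicator(Duration window) {
        this.windowMs = window.toMillis();
    }

    /** Returns true if the event is new, or last seen outside the window. */
    boolean accept(String eventId, long timestampMs) {
        Long prev = lastSeen.get(eventId);
        lastSeen.put(eventId, timestampMs);
        return prev == null || timestampMs - prev > windowMs;
    }
}
```

With a 2-hour window, a second copy of the same event ID an hour later is dropped, while a recurrence three hours later passes through as a fresh event.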
Mind if I ask you what you used to capture source changes if Debezium didn't work out for you? Is there a specific Kafka Connector for JDBC that you're using?
If we had a need for near-instantaneous model updates, then I'd definitely go Debezium. We didn't use it because we didn't need it - a generic KC JDBC source with a poll period of 5 seconds met our needs.
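For reference, a timestamp-mode config for the Confluent JDBC source connector with a 5-second poll looks roughly like this - the connector name, connection URL, table, and timestamp column are made-up examples, not the poster's setup:

```json
{
  "name": "model-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db:5432/app",
    "mode": "timestamp",
    "timestamp.column.name": "updated_at",
    "table.whitelist": "models",
    "poll.interval.ms": "5000",
    "topic.prefix": "db."
  }
}
```

`timestamp` mode picks up new and updated rows by watching the timestamp column, which is enough when the model table changes infrequently.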