Background

PostgreSQL introduced a feature called logical decoding in version 9.4. This poorly named feature lets us do some very cool things, like building a stream of events corresponding to changes in a table. Imagine something like this: your ORM layer connects to the DB and inserts, updates, and deletes rows within a DB transaction. Now imagine you can run an event processor that receives these changes once the DB transaction is committed, i.e. once the changes are final. You receive the complete row data, including both the old and the new values in the case of an update to an existing row.
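To make this concrete, here is a minimal sketch of consuming logical decoding output. It assumes a PostgreSQL server (9.4+) with `wal_level=logical`, the built-in `test_decoding` output plugin, and the `psycopg2` driver; the slot name, DSN, and helper names are illustrative, not part of the article's PGEvents tool.

```python
import re

def parse_test_decoding_line(line):
    """Parse one change line emitted by the built-in test_decoding plugin,
    e.g. "table public.users: INSERT: id[integer]:1 name[text]:'alice'".
    Returns a dict with the table, operation, and raw column values."""
    m = re.match(r"table (\S+): (INSERT|UPDATE|DELETE): (.*)", line)
    if not m:
        return None
    table, op, cols = m.groups()
    # Columns look like  name[type]:value ; values may be quoted strings.
    values = dict(re.findall(r"(\w+)\[[^\]]+\]:('(?:[^']|'')*'|\S+)", cols))
    return {"table": table, "op": op, "columns": values}

def poll_changes(dsn, slot="demo_slot"):
    """Illustrative polling loop: create a logical replication slot once,
    then fetch committed changes from it. Requires a live PostgreSQL server."""
    import psycopg2  # imported here so the parser above works without a DB
    conn = psycopg2.connect(dsn)
    conn.autocommit = True
    cur = conn.cursor()
    # One-time slot creation (errors if the slot already exists):
    # cur.execute("SELECT pg_create_logical_replication_slot(%s, 'test_decoding')", (slot,))
    cur.execute(
        "SELECT lsn, xid, data FROM pg_logical_slot_get_changes(%s, NULL, NULL)",
        (slot,),
    )
    for _lsn, _xid, data in cur.fetchall():
        change = parse_test_decoding_line(data)
        if change:
            print(change)
```

`pg_logical_slot_get_changes` consumes changes destructively; the peek variant, `pg_logical_slot_peek_changes`, leaves them on the slot, which is handy while experimenting.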
Hi Siva, great article! CDC is one of the core patterns these days in async, event-based systems.
We use Mongo with Confluent Kafka in production for something similar; both provide this integration seamlessly via Mongo change streams and Kafka connectors. I am wondering why you chose RabbitMQ over Kafka for this?
Stream PostgreSQL CDC events via PGEvents