Through Debezium, the Source connector captures changes via logical replication (CDC); in this particular setup we make use of the WAL (write-ahead log).
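
As a rough sketch, registering such a Debezium PostgreSQL connector against the Kafka Connect REST API could look like the snippet below. The connector name, hostnames, credentials, and table list are illustrative assumptions, not this project's exact config:

```sh
# Hypothetical Debezium source connector registration (values are assumptions).
# Note: older Debezium versions use "table.whitelist" instead of
# "table.include.list", and Debezium 2.x renames "database.server.name"
# to "topic.prefix".
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "users-source",
    "config": {
      "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
      "plugin.name": "pgoutput",
      "database.hostname": "postgres",
      "database.port": "5432",
      "database.user": "postgres",
      "database.password": "postgres",
      "database.dbname": "users_db",
      "database.server.name": "source",
      "table.include.list": "public.users"
    }
  }'
```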
Once we have the data out of the source database and into Kafka, we can move it around in a fast and frictionless manner.
Through Confluent's official Elasticsearch sink connector, the data is replicated from the source topic directly into the secondary data source.
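
A minimal sketch of the corresponding sink registration, assuming the Connect REST API on port 8083 and Elasticsearch at elasticsearch:9200 (connector name, topic, and URLs are illustrative):

```sh
# Hypothetical Elasticsearch sink connector registration.
# "type.name" is required by older versions of the connector only.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "users-sink",
    "config": {
      "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
      "topics": "source.public.users",
      "connection.url": "http://elasticsearch:9200",
      "type.name": "_doc",
      "key.ignore": "false",
      "schema.ignore": "true"
    }
  }'
```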
High-level overview of how the replication works, end to end (E2E).
Assuming you already ran the docker-compose:
make setup
Make sure the connectors are running:
make connectors-status
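
Under the hood, this kind of check can be done directly against the Kafka Connect REST API; the connector names below are assumptions:

```sh
# Query connector state via the Connect REST API (names are illustrative);
# a healthy connector reports "state": "RUNNING" for itself and its tasks.
curl -s http://localhost:8083/connectors/users-source/status
curl -s http://localhost:8083/connectors/users-sink/status
```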
Run the users-consumer to follow the messages being sent by the Source Connector to the "source" topic:
make users-consumer
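
A make target like this typically wraps the stock console consumer; a sketch assuming the broker on localhost:9092 and Debezium's default serverName.schema.table topic naming:

```sh
# Roughly what the target is likely doing (bootstrap address and
# topic name are assumptions, not the project's exact values).
kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic source.public.users \
  --from-beginning
```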
In another terminal window, produce some changes on the Source Database by calling the GraphQL API to insert/update users:
NAME=John make users-create-mutation
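
For illustration, the underlying GraphQL call might look like the request below; the endpoint, port, and mutation shape are assumptions, not the project's actual schema:

```sh
# Hypothetical GraphQL mutation; adjust host, port, and field names
# to match the actual API exposed by this project.
curl -X POST http://localhost:4000/graphql \
  -H "Content-Type: application/json" \
  -d '{"query": "mutation { createUser(name: \"John\") { id name } }"}'
```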
You should be able to see new messages appearing on the consumer.
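
Each message follows Debezium's change-event envelope, where `op` is `c` for create, `u` for update, and `d` for delete. A simplified example for an insert (field values are illustrative, and the exact shape depends on the configured converters):

```json
{
  "before": null,
  "after": { "id": 1, "name": "John" },
  "source": { "connector": "postgresql", "schema": "public", "table": "users" },
  "op": "c",
  "ts_ms": 1589000000000
}
```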
Make sure the data is being sent to Elasticsearch as well:
make users-search
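
You can also query Elasticsearch directly; a sketch assuming it listens on localhost:9200 and that the sink uses the topic name as the index (both assumptions):

```sh
# Hypothetical direct query; by default the sink connector names
# the index after the Kafka topic it consumes from.
curl -s 'http://localhost:9200/source.public.users/_search?q=name:John&pretty'
```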
You can access the Lenses dashboard to visually follow the Kafka components, including the connectors, by accessing fast-data-dev.
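
fast-data-dev conventionally serves its web UI on port 3030, so the dashboard is usually reachable at a URL like the one below (the actual port depends on your docker-compose port mapping):

```sh
# Default fast-data-dev UI port; verify against your docker-compose file.
# http://localhost:3030
```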