
Kafka

Primary PostgreSQL Data Source

Through Debezium, this source connector captures change data (CDC) from PostgreSQL's logical replication stream; in this setup, the changes are read from the write-ahead log (WAL).
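For illustration, a Debezium PostgreSQL source connector is typically registered through the Kafka Connect REST API with a payload along these lines. The connector name, hosts, credentials, and table list below are assumptions rather than values from this repo; the `make setup` target presumably registers the real configuration:

```sh
# Hypothetical registration of the Debezium source connector.
# "plugin.name": "pgoutput" selects PostgreSQL's built-in logical
# decoding plugin, which is what reads the changes from the WAL.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "users-source",
    "config": {
      "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
      "database.hostname": "postgres",
      "database.port": "5432",
      "database.user": "postgres",
      "database.password": "postgres",
      "database.dbname": "users_db",
      "database.server.name": "pg",
      "plugin.name": "pgoutput",
      "table.include.list": "public.users"
    }
  }'
```

With this naming, Debezium publishes changes to a topic following the `<server.name>.<schema>.<table>` pattern, i.e. `pg.public.users` here.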

Secondary Elasticsearch Data Source

Once the data is captured into Kafka, we can move it onward in a frictionless manner.

Through Confluent's official Elasticsearch sink connector, the data is replicated from the source topic directly into the secondary data source.
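Again purely as a sketch (the topic, index, and connection details are assumptions), the sink side of such a pipeline usually looks like this:

```sh
# Hypothetical registration of the Elasticsearch sink connector.
# "topics" must match the topic the Debezium source writes to;
# by default the connector indexes documents under the topic name.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "users-sink",
    "config": {
      "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
      "connection.url": "http://elasticsearch:9200",
      "topics": "pg.public.users",
      "key.ignore": "true",
      "schema.ignore": "true"
    }
  }'
```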

High-level overview of how the replication works, end to end:

(architecture diagram: PostgreSQL → Debezium source connector → Kafka → Elasticsearch sink connector → Elasticsearch)

Set Up the Environment

Assuming you have already brought the stack up with docker-compose, run:

make setup

Make sure the connectors are running

make connectors-status
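Under the hood, a check like this usually queries the Kafka Connect REST API. A hedged equivalent, reusing the connector names assumed in the sketches above:

```sh
# Connector names are assumptions; list the real ones with:
#   curl -s http://localhost:8083/connectors
curl -s http://localhost:8083/connectors/users-source/status
curl -s http://localhost:8083/connectors/users-sink/status
```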

Users

Run the users-consumer to follow the messages being sent by the Source Connector to the "source" topic.

make users-consumer
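This make target most likely wraps a plain console consumer. A rough equivalent, assuming Debezium's default `<server.name>.<schema>.<table>` topic naming from the sketches above:

```sh
# Tail the CDC topic from the beginning; the topic name is an assumption.
kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic pg.public.users --from-beginning
```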

In another terminal window, produce some changes on the Source Database by calling the GraphQL API to insert/update users:

NAME=John make users-create-mutation
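Purely illustrative, here is what such a mutation could look like as a raw GraphQL call; the endpoint, port, and schema (`createUser`, `name`) are assumptions, not taken from this repo:

```sh
# Hypothetical GraphQL mutation inserting a user into the source database.
curl -X POST http://localhost:4000/graphql \
  -H "Content-Type: application/json" \
  -d '{"query": "mutation { createUser(name: \"John\") { id name } }"}'
```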

You should be able to see the new messages appearing in the consumer.

Make sure the data is being sent to Elasticsearch as well.

make users-search
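A hedged equivalent is to query Elasticsearch directly; the index name below assumes the sink's default behavior of indexing under the topic name:

```sh
# Search the replicated documents in Elasticsearch.
curl -s 'http://localhost:9200/pg.public.users/_search?q=name:John'
```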

Lenses Dashboard

You can access the Lenses Dashboard to visually follow the Kafka components, including the connectors, by accessing fast-data-dev.
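With the stock lensesio/fast-data-dev image, the dashboard is typically served on port 3030 (http://localhost:3030); treat this as an assumption about the compose setup rather than a value confirmed by this repo.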