How to attach to multiple Kafka clusters #1980
-
We are creating an aggregator/ETL Spring Boot application that needs to connect to Kafka clusters hosted on AWS and an on-prem cluster. How do we configure it properly so that error handling puts the message onto a topic on the same cluster the message originated from? The AWS Kafka cluster contains 2 topics and their retry and dlt topics. The on-prem Kafka cluster contains 1 input topic with its associated retry and dlt topics, plus 3 output topics the application needs to send the ETL data to. What is the proper way to create ConcurrentKafkaListenerContainerFactory and KafkaTemplate beans for the two clusters? We have tried creating our own sets of ConcurrentKafkaListenerContainerFactory, ConsumerFactory, ProducerFactory and KafkaTemplate beans for each cluster by naming the beans with an "aws" or "onPrem" prefix. However, this results in a "Failed to load ApplicationContext" error: the context is unable to create the bean named 'kafkaListenerContainerFactory' because of a 'NoUniqueBeanDefinitionException' reporting 2 matches: 'awsConsumerFactory' and 'onPremConsumerFactory'. Any help or guidance would be appreciated.
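For context, a minimal sketch of the kind of per-cluster consumer beans described above; the broker addresses, group id, and deserializers are illustrative placeholders, not taken from the original project. With two `ConsumerFactory` beans like these and no bean actually named `kafkaListenerContainerFactory`, Boot's auto-configured container factory has to pick a single `ConsumerFactory` and fails with the `NoUniqueBeanDefinitionException` quoted above:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class MultiClusterConsumerConfig {

    // Consumer factory for the AWS cluster (bootstrap address is a placeholder).
    @Bean
    public ConsumerFactory<String, String> awsConsumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerProps("aws-broker:9092"));
    }

    // Consumer factory for the on-prem cluster (bootstrap address is a placeholder).
    @Bean
    public ConsumerFactory<String, String> onPremConsumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerProps("onprem-broker:9092"));
    }

    private Map<String, Object> consumerProps(String bootstrapServers) {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "etl-aggregator");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return props;
    }
}
```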
-
Name one of your factories `kafkaListenerContainerFactory`. Boot's conditional declaration looks like this:

```java
@Bean
@ConditionalOnMissingBean(name = "kafkaListenerContainerFactory")
ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
```

Hence, it will try to create this factory only if there is not already a bean with that name.
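A minimal sketch of that approach, assuming consumer factory beans named `awsConsumerFactory` and `onPremConsumerFactory` as in the question (which cluster gets the default factory name is an arbitrary choice):

```java
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

@Configuration
public class ListenerFactoryConfig {

    // Using the default name "kafkaListenerContainerFactory" satisfies Boot's
    // @ConditionalOnMissingBean(name = "kafkaListenerContainerFactory"), so the
    // auto-configured factory (and its single-ConsumerFactory lookup) backs off.
    @Bean(name = "kafkaListenerContainerFactory")
    public ConcurrentKafkaListenerContainerFactory<String, String> awsKafkaListenerContainerFactory(
            @Qualifier("awsConsumerFactory") ConsumerFactory<String, String> awsConsumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(awsConsumerFactory);
        return factory;
    }

    // The second cluster keeps its own distinct bean name.
    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> onPremKafkaListenerContainerFactory(
            @Qualifier("onPremConsumerFactory") ConsumerFactory<String, String> onPremConsumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(onPremConsumerFactory);
        return factory;
    }
}
```

Listeners for the on-prem cluster then select their factory explicitly, e.g. `@KafkaListener(topics = "onprem-input", containerFactory = "onPremKafkaListenerContainerFactory")`. The same per-cluster naming applies to the `ProducerFactory`/`KafkaTemplate` pairs; wiring each cluster's `KafkaTemplate` into that cluster's error handling (e.g. a `DeadLetterPublishingRecoverer`) keeps retry/DLT records on the cluster the message originated from.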