Fix typos in docs and classes
bky373 committed Apr 30, 2024
1 parent e547d0a commit 06d8c71
Showing 10 changed files with 18 additions and 18 deletions.
@@ -197,7 +197,7 @@ When using a POJO batch listener (e.g. `List<Thing>`), and you don't have the fu
----
@KafkaListener(id = "recovering", topics = "someTopic")
public void listen(List<Thing> things) {
for (int i = 0; i < records.size(); i++) {
for (int i = 0; i < things.size(); i++) {
try {
process(things.get(i));
}
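For readers following along, the surrounding documentation example lets the configured error handler (typically `DefaultErrorHandler`) know which record in the batch failed by throwing a `BatchListenerFailedException` with the failing index. A minimal self-contained sketch of the full pattern (the `Thing` type and `process()` method are placeholders) might look like:

[source, java]
----
@KafkaListener(id = "recovering", topics = "someTopic")
public void listen(List<Thing> things) {
    for (int i = 0; i < things.size(); i++) {
        try {
            process(things.get(i));
        }
        catch (Exception e) {
            // report the index of the failed record so the error handler can
            // commit the records before it and retry/recover from this one onwards
            throw new BatchListenerFailedException("Failed to process", e, i);
        }
    }
}
----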
@@ -89,7 +89,7 @@ The `ConsumerRetryAuthEvent` event has the following properties:
** `AUTHENTICATION` - the event was published because of an authentication exception.
** `AUTHORIZATION` - the event was published because of an authorization exception.

The `ConsumerStartingEvent`, `ConsumerStartingEvent`, `ConsumerFailedToStartEvent`, `ConsumerStoppedEvent`, `ConsumerRetryAuthSuccessfulEvent` and `ContainerStoppedEvent` events have the following properties:
The `ConsumerStartingEvent`, `ConsumerStartedEvent`, `ConsumerFailedToStartEvent`, `ConsumerStoppedEvent`, `ConsumerRetryAuthSuccessfulEvent` and `ContainerStoppedEvent` events have the following properties:

* `source`: The listener container instance that published the event.
* `container`: The listener container or the parent listener container, if the source container is a child.
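
As an aside, these events can be observed with a plain Spring `@EventListener`; a minimal sketch (bean name and logging are illustrative) follows:

[source, java]
----
@Component
public class ContainerEventListener {

    @EventListener
    public void onConsumerStopped(ConsumerStoppedEvent event) {
        // 'source' is the listener container instance that published the event;
        // the 'container' property is that container, or its parent if the
        // source container is a child
        System.out.println("Consumer stopped by container " + event.getSource());
    }
}
----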
@@ -54,7 +54,7 @@ public class KafkaConfig {
@Bean
public Map<String, Object> consumerConfigs() {
Map<String, Object> props = new HashMap<>();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
...
return props;
}
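For completeness, a minimal sketch of feeding such a properties map into a consumer factory and a listener container factory (the `Integer`/`String` key and value types are assumptions for illustration):

[source, java]
----
@Bean
public ConsumerFactory<Integer, String> consumerFactory() {
    // reuse the consumer properties defined above
    return new DefaultKafkaConsumerFactory<>(consumerConfigs());
}

@Bean
public ConcurrentKafkaListenerContainerFactory<Integer, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
----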
@@ -166,7 +166,7 @@ future.whenComplete((result, ex) -> {
`SendResult` has two properties, a `ProducerRecord` and `RecordMetadata`.
See the Kafka API documentation for information about those objects.

The `Throwable` can be cast to a `KafkaProducerException`; its `failedProducerRecord` property contains the failed record.
The `Throwable` can be cast to a `KafkaProducerException`; its `producerRecord` property contains the failed record.

If you wish to block the sending thread to await the result, you can invoke the future's `get()` method; using the method with a timeout is recommended.
If you have set a `linger.ms`, you may wish to invoke `flush()` before waiting or, for convenience, the template has a constructor with an `autoFlush` parameter that causes the template to `flush()` on each send.
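
Putting the two styles together, a hedged sketch (topic name, payload, and timeout are placeholders; the exact record property on `KafkaProducerException` is the one discussed above):

[source, java]
----
CompletableFuture<SendResult<String, String>> future = template.send("someTopic", "someValue");

// non-blocking: react when the send completes
future.whenComplete((result, ex) -> {
    if (ex == null) {
        System.out.println("Sent, offset: " + result.getRecordMetadata().offset());
    }
    else {
        // castable to KafkaProducerException, which carries the failed record
        System.err.println("Send failed: " + ex.getMessage());
    }
});

// blocking: wait for the result, preferably with a timeout
SendResult<String, String> sendResult = template.send("someTopic", "someValue").get(10, TimeUnit.SECONDS);
----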
@@ -216,7 +216,7 @@ public void sendToKafka(final MyOutputData data) {
----
====

Note that the cause of the `ExecutionException` is `KafkaProducerException` with the `failedProducerRecord` property.
Note that the cause of the `ExecutionException` is `KafkaProducerException` with the `producerRecord` property.

[[routing-template]]
== Using `RoutingKafkaTemplate`
@@ -463,7 +463,7 @@ The `SmartMessageConverter.toMessage()` method is called to create a new outbound
Similarly, in the `KafkaMessageConverter.toMessage()` method, after the converter has created a new `Message<?>` from the `ConsumerRecord`, the `SmartMessageConverter.fromMessage()` method is called and then the final inbound message is created with the newly converted payload.
In either case, if the `SmartMessageConverter` returns `null`, the original message is used.

When the default converter is used in the `KafkaTemplate` and listener container factory, you configure the `SmartMessageConverter` by calling `setMessagingConverter()` on the template and via the `contentMessageConverter` property on `@KafkaListener` methods.
When the default converter is used in the `KafkaTemplate` and listener container factory, you configure the `SmartMessageConverter` by calling `setMessagingConverter()` on the template and via the `contentTypeConverter` property on `@KafkaListener` methods.

Examples:
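
A minimal sketch of the two configuration styles, assuming a `SmartMessageConverter` bean named `mySmartConverter` (bean, listener, and payload names are illustrative):

[source, java]
----
// on the producing side
template.setMessagingConverter(mySmartConverter);

// on the consuming side
@KafkaListener(id = "smart", topics = "someTopic", contentTypeConverter = "mySmartConverter")
public void listen(Thing thing) {
    // 'thing' is the payload produced by the converter chain
}
----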

@@ -664,7 +664,7 @@ Consider using a `DelegatingByTypeSerializer` configured to use a `ByteArraySeri
Starting with version 3.1, you can add a `Validator` to the `ErrorHandlingDeserializer`.
If the delegate `Deserializer` successfully deserializes the object, but that object fails validation, an exception is thrown similar to a deserialization exception occurring.
This allows the original raw data to be passed to the error handler.
WHen creating the deserializer yourself, simply call `setValidator`; if you configure the serializer using properties, set the consumer configuration property `ErrorHandlingDeserializer.VALIDATOR_CLASS` to the class or fully qualified class name for your `Validator`.
When creating the deserializer yourself, simply call `setValidator`; if you configure the serializer using properties, set the consumer configuration property `ErrorHandlingDeserializer.VALIDATOR_CLASS` to the class or fully qualified class name for your `Validator`.
When using Spring Boot, this property name is `spring.kafka.consumer.properties.spring.deserializer.validator.class`.
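
A hedged sketch of both routes (the `Thing` payload type and `MyValidator` are placeholders):

[source, java]
----
// programmatic: create the deserializer yourself and set the validator
ErrorHandlingDeserializer<Thing> deserializer =
        new ErrorHandlingDeserializer<>(new JsonDeserializer<>(Thing.class));
deserializer.setValidator(new MyValidator());

// property-based: let the consumer build the deserializer from configuration
props.put(ErrorHandlingDeserializer.VALIDATOR_CLASS, MyValidator.class);
----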

[[payload-conversion-with-batch]]
@@ -33,7 +33,7 @@ public void processMessage(MyPojo message) {
public RetryTopicConfiguration myRetryTopic(KafkaTemplate<String, MyPojo> template) {
return RetryTopicConfigurationBuilder
.newInstance()
.fixedBackoff(3_000)
.fixedBackOff(3_000)
.maxAttempts(4)
.create(template);
}
@@ -47,7 +47,7 @@ You can also provide a custom implementation of Spring Retry's `SleepingBackOffP
public RetryTopicConfiguration myRetryTopic(KafkaTemplate<String, MyPojo> template) {
return RetryTopicConfigurationBuilder
.newInstance()
.customBackOff(new MyCustomBackOffPolicy())
.customBackoff(new MyCustomBackOffPolicy())
.maxAttempts(5)
.create(template);
}
@@ -81,7 +81,7 @@ public void processMessage(MyPojo message) {
public RetryTopicConfiguration myRetryTopic(KafkaTemplate<String, MyPojo> template) {
return RetryTopicConfigurationBuilder
.newInstance()
.fixedBackoff(2_000)
.fixedBackOff(2_000)
.timeoutAfter(5_000)
.create(template);
}
@@ -212,7 +212,7 @@ If your broker version is earlier than 2.4, you will need to set an explicit val
[[retry-headers]]
== Failure Header Management

When considering how to manage failure headers (original headers and exception headers), the framework delegates to the `DeadLetterPublishingRecover` to decide whether to append or replace the headers.
When considering how to manage failure headers (original headers and exception headers), the framework delegates to the `DeadLetterPublishingRecoverer` to decide whether to append or replace the headers.

By default, it explicitly sets `appendOriginalHeaders` to `false` and leaves `stripPreviousExceptionHeaders` at the default used by the `DeadLetterPublishingRecoverer`.

@@ -221,7 +221,7 @@ This is to avoid creation of excessively large messages (due to the stack trace

See xref:kafka/annotation-error-handling.adoc#dlpr-headers[Managing Dead Letter Record Headers] for more information.

To reconfigure the framework to use different settings for these properties, configure a `DeadLetterPublishingRecover` customizer by overriding the `configureCustomizers` method in a `@Configuration` class that extends `RetryTopicConfigurationSupport`.
To reconfigure the framework to use different settings for these properties, configure a `DeadLetterPublishingRecoverer` customizer by overriding the `configureCustomizers` method in a `@Configuration` class that extends `RetryTopicConfigurationSupport`.
See xref:retrytopic/retry-config.adoc#retry-topic-global-settings[Configuring Global Settings and Features] for more details.

[source, java]
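A hedged sketch of such a customizer, assuming the `CustomizersConfigurer` callback exposed by `RetryTopicConfigurationSupport`:

[source, java]
----
@Configuration
public class MyRetryTopicConfiguration extends RetryTopicConfigurationSupport {

    @Override
    protected void configureCustomizers(CustomizersConfigurer customizersConfigurer) {
        customizersConfigurer.customizeDeadLetterPublishingRecoverer(dlpr -> {
            // restore original headers and stop stripping previous exception headers
            dlpr.setAppendOriginalHeaders(true);
            dlpr.setStripPreviousExceptionHeaders(false);
        });
    }
}
----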
@@ -92,7 +92,7 @@ public void processMessage(MyPojo message) {
public RetryTopicConfiguration myRetryTopic(KafkaTemplate<String, MyPojo> template) {
return RetryTopicConfigurationBuilder
.newInstance()
.fixedBackoff(3_000)
.fixedBackOff(3_000)
.maxAttempts(5)
.useSingleTopicForFixedDelays()
.create(template);
@@ -1,5 +1,5 @@
/*
* Copyright 2018-2023 the original author or authors.
* Copyright 2018-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -408,7 +408,7 @@ public RequestReplyFuture<K, V, R> sendAndReceive(ProducerRecord<K, V> record) {

@Override
public RequestReplyFuture<K, V, R> sendAndReceive(ProducerRecord<K, V> record, @Nullable Duration replyTimeout) {
Assert.state(this.running, "Template has not been start()ed"); // NOSONAR (sync)
Assert.state(this.running, "Template has not been started"); // NOSONAR (sync)
Duration timeout = replyTimeout;
if (timeout == null) {
timeout = this.defaultReplyTimeout;
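For context, a hedged sketch of calling `sendAndReceive` on a started `ReplyingKafkaTemplate` (topic, types, and timeout are placeholders):

[source, java]
----
ProducerRecord<String, String> record = new ProducerRecord<>("kRequests", "someRequest");
RequestReplyFuture<String, String, String> replyFuture = replyingTemplate.sendAndReceive(record);

// optionally confirm the outbound send succeeded
SendResult<String, String> sendResult = replyFuture.getSendFuture().get(10, TimeUnit.SECONDS);

// then wait for the correlated reply record
ConsumerRecord<String, String> reply = replyFuture.get(10, TimeUnit.SECONDS);
----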
@@ -53,7 +53,7 @@ public class BackOffValuesGenerator {
public BackOffValuesGenerator(int providedMaxAttempts, BackOffPolicy providedBackOffPolicy) {
this.numberOfValuesToCreate = getMaxAttempts(providedMaxAttempts) - 1;
BackOffPolicy policy = providedBackOffPolicy != null ? providedBackOffPolicy : DEFAULT_BACKOFF_POLICY;
checkBackOffPolicyTipe(policy);
checkBackOffPolicyType(policy);
this.backOffPolicy = policy;
}

@@ -69,7 +69,7 @@ public List<Long> generateValues() {
: generateFromSleepingBackOffPolicy(this.numberOfValuesToCreate, this.backOffPolicy);
}

private void checkBackOffPolicyTipe(BackOffPolicy providedBackOffPolicy) {
private void checkBackOffPolicyType(BackOffPolicy providedBackOffPolicy) {
if (!(SleepingBackOffPolicy.class.isAssignableFrom(providedBackOffPolicy.getClass())
|| NoBackOffPolicy.class.isAssignableFrom(providedBackOffPolicy.getClass()))) {
throw new IllegalArgumentException("Either a SleepingBackOffPolicy or a NoBackOffPolicy must be provided. " +
@@ -116,7 +116,7 @@
* public RetryTopicConfiguration myRetryableTopic(KafkaTemplate&lt;String, MyPojo&gt; template) {
* return RetryTopicConfigurationBuilder
* .newInstance()
* .fixedBackoff(3000)
* .fixedBackOff(3000)
* .maxAttempts(5)
* .includeTopics("my-topic", "my-other-topic")
* .create(template);