Spring kafka stream receives duplicate messages on restart #2804
Unanswered · rajagopalmani asked this question in Q&A
Replies: 1 comment
"Exactly once" in Kafka means a complete consume→process→produce cycle is performed exactly once. However, a common misconception is that the guarantee means each record is *delivered* exactly once; it does not cover external side effects such as posting to a web service. In this case, it looks like the offsets are not being committed, for some reason; we are not Kafka Streams experts here. This has nothing to do with Spring; you should ask questions about Kafka itself to the wider Kafka community, either on Stack Overflow or one of their mailing lists.
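Because the exactly-once guarantee stops at Kafka's own boundaries, a restart can replay records whose external side effect already ran. A minimal sketch (hypothetical, not from the discussion) of guarding a non-transactional side effect by tracking already-handled message ids:

```java
import java.util.HashSet;
import java.util.Set;

// Sketch only: Kafka's exactly-once covers the consume→process→produce
// cycle inside Kafka, not external calls like a SOAP request, so the side
// effect itself must be made idempotent. Here we track handled ids in
// memory; a real application would use a durable store.
public class IdempotentSideEffect {
    private final Set<String> handled = new HashSet<>();

    // Returns true only the first time a given message id is seen.
    public boolean postOnce(String messageId) {
        if (!handled.add(messageId)) {
            return false; // duplicate delivery after a restart: skip the SOAP call
        }
        // real app: post to the SOAP web service here
        return true;
    }
}
```

With this guard, a replayed batch after a crash re-enters `postOnce` but does not repeat the external call.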
My Spring Boot application doesn't have a sink connector; instead, it uses a simple `foreach` where I process each message (in the real application, by posting it to a SOAP web service).
It is a stateless stream.
I enabled `PROCESSING_GUARANTEE_CONFIG` with `"exactly_once_v2"`.
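For reference, a hedged sketch of that setting using the literal property names (`StreamsConfig.PROCESSING_GUARANTEE_CONFIG` resolves to `"processing.guarantee"`); the application id and broker address are placeholders:

```java
import java.util.Properties;

// Minimal Kafka Streams configuration sketch. Note that exactly_once_v2
// requires brokers at version 2.5 or newer.
public class StreamsProps {
    public static Properties build() {
        Properties props = new Properties();
        props.put("application.id", "number-stream-app"); // placeholder
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("processing.guarantee", "exactly_once_v2");
        return props;
    }
}
```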
To keep things simple, the application quits when there is an error.
To keep things simple, I send numbers as String messages.
To simulate my issue, I added an if statement in the foreach to throw an error when the message number % 15 == 0.
Let's say I first send numbers 1 to 10 as String messages from the producer. The consumer consumes them all fine.
After a while I send 11 to 20 as String messages from the producer. The consumer will fail while processing number 15 due to the if condition.
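A pure-Java sketch of that failure scenario (a hypothetical stand-in for the actual Kafka Streams `foreach`, which is in the linked repo):

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the foreach step described above: messages 11..20 arrive,
// any number divisible by 15 throws, and the application then quits, so
// the offsets for this batch are never committed.
public class ForeachFailureDemo {

    static List<String> run() {
        List<String> processed = new ArrayList<>();
        for (int i = 11; i <= 20; i++) {
            String message = String.valueOf(i);
            try {
                if (i % 15 == 0) {
                    throw new IllegalStateException("Simulated failure at " + message);
                }
                processed.add(message); // real app: post to SOAP web service
            } catch (IllegalStateException e) {
                break; // app quits; 11..14 were processed but will replay on restart
            }
        }
        return processed;
    }

    public static void main(String[] args) {
        System.out.println("Processed before crash: " + run());
    }
}
```

On restart, because the offsets were never committed, 11 to 14 are delivered again even though their side effect already ran, which matches the duplicate-message behavior reported in the title.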
https://github.com/rajagopalmani/kafka-stream-project.git
Change the loop bound of `i` in the producer main class to modify the number stream produced.
The Kafka Docker setup I used is attached in the project's docker-compose.yaml.