33,390 questions
1 vote · 0 answers · 2k views
@Controller Kafka Listener exception not caught by Exception handler @ControllerAdvice
I have a Kafka Listener marked with @Controller:
@KafkaListener(topics = "${binlookup.kafka.topic.insert}")
public BinInfoJsonDTO listenAndInsert(String binInfoJson, Acknowledgment ack) {
...
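A @ControllerAdvice only intercepts exceptions thrown on Spring MVC request threads; a @KafkaListener runs on a listener container thread, so its exceptions never reach it. A minimal sketch of the usual alternative, a KafkaListenerErrorHandler wired to the listener (the bean name binLookupErrorHandler is hypothetical):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.listener.KafkaListenerErrorHandler;

@Configuration
public class KafkaErrorConfig {

    // Handles exceptions thrown by the listener method on the container thread.
    @Bean
    public KafkaListenerErrorHandler binLookupErrorHandler() {
        return (message, exception) -> {
            // Log, dead-letter, etc. -- @ControllerAdvice never sees this thread.
            System.err.println("Listener failed for payload: " + message.getPayload());
            return null; // or rethrow to let the container's error handling run
        };
    }
}

// Reference it from the listener:
// @KafkaListener(topics = "${binlookup.kafka.topic.insert}",
//                errorHandler = "binLookupErrorHandler")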
11 votes · 3 answers · 9k views
Creating Kafka topic in sarama
Is it possible to create a Kafka topic in sarama?
I know the Java API lets you create topics, but I couldn't find any information on how to do that in sarama.
If it's possible, an example or explanation ...
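sarama added a ClusterAdmin type with a comparable CreateTopic call in later releases. For reference, here is what the Java AdminClient equivalent the asker mentions looks like (broker address, topic name, and partition settings are placeholders):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopicExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // Topic name, 3 partitions, replication factor 1 -- adjust for your cluster.
            NewTopic topic = new NewTopic("example-topic", 3, (short) 1);
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}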
1 vote · 2 answers · 13k views
Failed to resolve 'kafka:9092': Temporary failure in name resolution
I'm facing the following issue:
%3|1622567567.487|FAIL|rdkafka#consumer-2| [thrd:GroupCoordinator]: GroupCoordinator: kafka:9092: Failed to resolve 'kafka:9092': Temporary failure in name resolution (after ...
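This error usually means the client is running outside the network (often a Docker network) in which the hostname 'kafka' is resolvable, so the advertised listener points at a name the client cannot look up. A quick sanity check of name resolution from the client's environment, sketched in Java:

import java.net.InetAddress;
import java.net.UnknownHostException;

public class ResolveCheck {
    public static void main(String[] args) {
        try {
            // 'kafka' is the hostname from the failing advertised listener.
            InetAddress addr = InetAddress.getByName("kafka");
            System.out.println("'kafka' resolves to " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            // Same failure librdkafka reports: fix advertised.listeners,
            // add a hosts entry, or run the client inside the Docker network.
            System.err.println("Cannot resolve 'kafka': " + e.getMessage());
        }
    }
}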
3 votes · 2 answers · 8k views
SpringBoot, Kafka : java.lang.NoSuchMethodError: org.apache.kafka.clients.producer.Producer.close(Ljava/time/Duration;)V [duplicate]
I'm using Spring Boot v2.2.4 and Apache Kafka in my project.
Below is my pom.xml file:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
...
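Producer.close(java.time.Duration) only exists in newer kafka-clients releases, so this NoSuchMethodError usually means an older kafka-clients jar is on the classpath than the one spring-kafka was compiled against, often because a Kafka version was pinned by hand. A hedged fix, assuming Maven and Spring Boot's managed kafka.version property (2.3.1 here is illustrative; mvn dependency:tree shows which jar actually wins):

<properties>
    <!-- Align kafka-clients with what your spring-kafka release was built against. -->
    <kafka.version>2.3.1</kafka.version>
</properties>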
0 votes · 0 answers · 50 views
Kafka Streams 4.x Java 11 Compatibility
The Kafka documentation states that Kafka 4.x supports Java 11 for clients, including Kafka Streams, but not for brokers, which require Java 17.
I'm running a Kafka Streams Scala 4.x consumer. ...
17 votes · 5 answers · 22k views
Is there any simulator/tool to generate messages for streaming?
For testing purposes, I need to simulate a client generating 100,000 messages per second and sending them to a Kafka topic. Is there any tool or way that can help me generate these random messages?
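Kafka itself ships a load generator, kafka-producer-perf-test, which covers exactly this case. If a custom client is preferred, a plain producer loop can approximate that rate; a minimal sketch (topic name and batching values are illustrative):

import java.util.Properties;
import java.util.concurrent.ThreadLocalRandom;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LoadGenerator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Batching matters far more than the loop itself for hitting 100k msg/s.
        props.put(ProducerConfig.LINGER_MS_CONFIG, "5");
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, "65536");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 100_000; i++) {
                String payload = "msg-" + i + "-" + ThreadLocalRandom.current().nextLong();
                producer.send(new ProducerRecord<>("load-test-topic", payload));
            }
        } // close() flushes any remaining batches
    }
}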
5 votes · 2 answers · 16k views
How do I stream a video file using Kafka?
I'm trying to send multiple .mp4 files as kafka stream messages.
I tried to follow the same approach as for text messages, but it did not work out.
Does it mean I need a special Encoder/Decoder/...
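Kafka treats message values as opaque bytes, so rather than a video-specific encoder, the usual approach is the byte-array serializer; and because brokers default to roughly 1 MB per message, either raise message.max.bytes/max.request.size or chunk the file. A sketch of the chunking approach (file name, topic, and chunk size are illustrative):

import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class VideoProducer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());

        byte[] video = Files.readAllBytes(Path.of("clip.mp4")); // hypothetical file
        int chunkSize = 512 * 1024; // stay under the broker's message.max.bytes

        try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
            for (int offset = 0, seq = 0; offset < video.length; offset += chunkSize, seq++) {
                int end = Math.min(offset + chunkSize, video.length);
                byte[] chunk = Arrays.copyOfRange(video, offset, end);
                // Key by file + sequence so a consumer can reassemble in order.
                producer.send(new ProducerRecord<>("video-topic", "clip.mp4-" + seq, chunk));
            }
        }
    }
}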
0 votes · 1 answer · 58 views
Spring Kafka consumer stops consuming after 1–2 days with ExpiringCredentialRefreshingLogin logs
I have a Spring Kafka application with a single consumer. The Kafka client authentication is configured using SASL/OAUTHBEARER over SSL, as shown below:
authProps.put(SaslConfigs.SASL_MECHANISM,...
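ExpiringCredentialRefreshingLogin is the client class that re-acquires tokens in the background; if refresh stops succeeding, the consumer eventually fails authentication and goes quiet. A hedged sketch of the refresh-related settings worth checking (the callback handler class name is hypothetical):

import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SaslConfigs;

authProps.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
authProps.put(SaslConfigs.SASL_MECHANISM, "OAUTHBEARER");
authProps.put(SaslConfigs.SASL_JAAS_CONFIG,
    "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;");
// Hypothetical handler that fetches tokens from your identity provider.
authProps.put(SaslConfigs.SASL_LOGIN_CALLBACK_HANDLER_CLASS,
    "com.example.OAuthTokenCallbackHandler");
// Re-login at 50% of token lifetime and no later than 120s before expiry,
// so a single failed refresh attempt still leaves room to retry.
authProps.put(SaslConfigs.SASL_LOGIN_REFRESH_WINDOW_FACTOR, "0.5");
authProps.put(SaslConfigs.SASL_LOGIN_REFRESH_BUFFER_SECONDS, "120");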
10 votes · 1 answer · 11k views
confluent kafka producer KafkaError{code=_MSG_TIMED_OUT,val=-192,str="Local: Message timed out"}
I'm new to Kafka, using Confluent Kafka, and trying to write messages to an existing Kafka topic from an AWS EC2 instance using Python producer code with 'sasl.mechanism': 'PLAIN', 'security.protocol': '...
1 vote · 0 answers · 74 views
How to configure ACLs with SASL_SSL OAUTHBEARER in Apache Kafka (KRaft mode, multi-node setup)?
I am trying to secure an Apache Kafka cluster running in KRaft mode using SASL_SSL with OAUTHBEARER authentication and enforce ACLs.
I have a multi-node setup - controller and broker run on different ...
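For KRaft clusters the built-in authorizer is StandardAuthorizer (authorizer.class.name=org.apache.kafka.metadata.authorizer.StandardAuthorizer, plus super.users for the bootstrap principal); once it is active, ACLs can be managed over the AdminClient. A minimal sketch, with broker address, topic, and principal as placeholders (SASL_SSL client properties omitted; the admin client itself must authenticate as a principal allowed to alter the cluster):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class AclExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093");
        try (AdminClient admin = AdminClient.create(props)) {
            // Allow the consumer principal to READ the 'orders' topic.
            AclBinding allowRead = new AclBinding(
                new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL),
                new AccessControlEntry("User:app-consumer", "*",
                    AclOperation.READ, AclPermissionType.ALLOW));
            admin.createAcls(Collections.singleton(allowRead)).all().get();
        }
    }
}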
1 vote · 1 answer · 81 views
Apache Camel not propagating traceparent in Open Telemetry
I have a Camel Spring Boot app that reads a message from a Kafka topic. I have a Spring Boot test that places a message on an embedded Kafka broker and adds an OpenTelemetry 'traceparent' header, ...
1 vote · 2 answers · 10k views
Kafka consumer-client is not registering offset of consumer group on zookeeper
I'm trying to create multiple consumers with different consumer groups on a Kafka topic using kafka-clients v0.10.2.1. However, I'm not able to retrieve the last offset committed by a consumer group.
...
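With the 0.10.x "new" consumer (configured with bootstrap.servers rather than zookeeper.connect), offsets are committed to the internal __consumer_offsets topic, not to ZooKeeper, so ZooKeeper-based offset tooling sees nothing. Use kafka-consumer-groups with --bootstrap-server, or read the committed offset directly; a sketch (group, topic, and partition are placeholders):

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CommittedOffsetCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group"); // group to inspect
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            TopicPartition tp = new TopicPartition("my-topic", 0);
            OffsetAndMetadata committed = consumer.committed(tp); // null if none committed
            System.out.println("Committed: " + committed);
        }
    }
}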
0 votes · 1 answer · 62 views
Kafka broker down because all log dirs have failed [closed]
Running Confluent Kafka 7.9.0 on Ubuntu kernel 6.8.0 in AWS EC2. Zookeeper is already up.
$ sudo systemctl restart confluent-server
$ sudo journalctl -u confluent-server.service -e
$ ... removing ...
-2 votes · 0 answers · 50 views
Kafka Connect Refuse to Flink SQL [closed]
I want to process data through Flink SQL; here is my code:
CREATE TABLE ohlc_source (
screener STRING,
symbol STRING,
`open` DOUBLE,
`high` DOUBLE,
`low` DOUBLE,
`close` DOUBLE,
`volume` ...
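For reference, Flink's Kafka SQL connector is configured in the table's WITH clause; a hedged sketch of how such a source table is typically declared (topic, bootstrap servers, format, and the volume column's type are assumptions):

CREATE TABLE ohlc_source (
  screener STRING,
  symbol   STRING,
  `open`   DOUBLE,
  `high`   DOUBLE,
  `low`    DOUBLE,
  `close`  DOUBLE,
  `volume` DOUBLE
) WITH (
  'connector' = 'kafka',
  'topic' = 'ohlc',                               -- hypothetical topic name
  'properties.bootstrap.servers' = 'kafka:9092',  -- hypothetical address
  'properties.group.id' = 'flink-ohlc',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json'
);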
93 votes · 4 answers · 66k views
Why is Kafka pull-based instead of push-based?
Why is Kafka pull-based instead of push-based? I agree Kafka gives high throughput, as I have experienced it, but I don't see how Kafka's throughput would go down if it were push-based. Any ideas on ...