kafka-consumer-api

How does @SendTo send the message to the related topic?

Submitted by 瘦欲@ on 2019-12-11 07:28:15
Question: I am using ReplyingKafkaTemplate in my REST controller to return a synchronous response. I am also setting the REPLY_TOPIC header. On the listener microservice side:

    @KafkaListener(topics = "${kafka.topic.request-topic}")
    @SendTo
    public Model listen(Model<SumModel, SumResp> request) throws InterruptedException {
        SumModel model = request.getRequest();
        int sum = model.getNumber1() + model.getNumber2();
        SumResp resp = new SumResp(sum);
        request.setReply(resp);
        request.setAdditionalProperty("sum", sum);
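For reference, a minimal sketch of the requesting side (the client class, topic names, and wiring are assumptions, not from the question): ReplyingKafkaTemplate attaches a REPLY_TOPIC header to the outgoing record, and because @SendTo names no destination, Spring publishes the listener's return value to whatever topic that header carries.

    import java.util.concurrent.TimeUnit;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;
    import org.springframework.kafka.requestreply.RequestReplyFuture;
    import org.springframework.kafka.support.KafkaHeaders;

    public class SumClient {
        private final ReplyingKafkaTemplate<String, SumModel, SumResp> template;

        public SumClient(ReplyingKafkaTemplate<String, SumModel, SumResp> template) {
            this.template = template;
        }

        public SumResp requestSum(SumModel model) throws Exception {
            // Hypothetical topic names. An empty @SendTo falls back to the
            // REPLY_TOPIC header carried by the incoming request record.
            ProducerRecord<String, SumModel> record = new ProducerRecord<>("sum-requests", model);
            record.headers().add(KafkaHeaders.REPLY_TOPIC, "sum-replies".getBytes());
            RequestReplyFuture<String, SumModel, SumResp> future = template.sendAndReceive(record);
            return future.get(10, TimeUnit.SECONDS).value(); // blocks until the listener replies
        }
    }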

KafkaConsumer is not safe for multi-threaded access from Spark Streaming

Submitted by 别说谁变了你拦得住时间么 on 2019-12-11 07:17:20
Question: I have set up multiple streams reading from different Kafka topics:

    for (topic <- topics) {
      val stream = KafkaUtils.createDirectStream[String, String](
        ssc,
        PreferConsistent,
        Subscribe[String, String](Array(topicConfig.srcTopic), kafkaParameters())
      )
      stream.map(...).reduce(...)...
    }

kafkaParameters basically has the needed config:

    "bootstrap.servers" -> bootstrapServers,
    "key.deserializer" -> classOf[StringDeserializer],
    "value.deserializer" -> classOf[StringDeserializer],
    "group.id" ->
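The error in the title commonly surfaces when several direct streams share one group.id and end up touching the same cached consumer. A hedged sketch of one common workaround, shown with the Java API (broker address, group id, and topic names are assumptions; jssc is an existing JavaStreamingContext): subscribe to all topics from a single direct stream instead of creating one stream per topic.

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    Map<String, Object> kafkaParams = new HashMap<>();
    kafkaParams.put("bootstrap.servers", "localhost:9092");      // assumed
    kafkaParams.put("key.deserializer", StringDeserializer.class);
    kafkaParams.put("value.deserializer", StringDeserializer.class);
    kafkaParams.put("group.id", "spark-consumer-group");         // assumed

    // One stream over all topics keeps a single KafkaConsumer per executor task.
    JavaInputDStream<ConsumerRecord<String, String>> stream =
        KafkaUtils.createDirectStream(
            jssc,
            LocationStrategies.PreferConsistent(),
            ConsumerStrategies.<String, String>Subscribe(
                Arrays.asList("topicA", "topicB"), kafkaParams)); // assumed topics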

Kafka Consumer - topic(s) with higher priority

Submitted by 本秂侑毒 on 2019-12-11 07:14:27
Question: I am using a Kafka consumer to read from several topics, and I need one of them to have higher priority. Processing takes a lot of time, and there are always many messages in the (low-priority) topics, but I need the messages from the other one to be processed as soon as possible. This is a similar question to Does Kafka support priority for topic or message?, but that one uses the old API. The new API (0.10.1.1) has the methods KafkaConsumer::pause(Collection) and KafkaConsumer::resume(Collection). But it's
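A hedged sketch of a priority scheme built on those two methods (the topic names, the process handler, and a consumer already subscribed to both topics are assumptions): pause the low-priority partitions whenever the high-priority topic delivers records, and resume them once it runs dry.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Set;
    import java.util.stream.Collectors;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.common.TopicPartition;

    while (true) {
        ConsumerRecords<String, String> records = consumer.poll(100); // 0.10.x-style poll
        // Recompute every iteration: assignments change across rebalances.
        Set<TopicPartition> bulk = consumer.assignment().stream()
            .filter(tp -> tp.topic().equals("bulk"))                  // assumed low-priority topic
            .collect(Collectors.toSet());
        List<ConsumerRecord<String, String>> urgent = new ArrayList<>();
        records.records("urgent").forEach(urgent::add);               // assumed high-priority topic
        if (!urgent.isEmpty()) {
            consumer.pause(bulk);   // starve the bulk topic while urgent work exists
        } else {
            consumer.resume(bulk);
        }
        for (ConsumerRecord<String, String> record : records) {
            process(record);
        }
    }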

Is Kafka suitable for running a public API?

Submitted by 旧城冷巷雨未停 on 2019-12-11 06:57:30
Question: I have an event stream that I want to publish. It is partitioned into topics, updates continually, will need to scale horizontally (and not having a SPOF is nice), and may require replaying old events in certain circumstances; all features that seem to match Kafka's capabilities. I want to publish this to the world through a public API that anyone can connect to and get events from. Is Kafka a suitable technology for exposing as a public API? I've read the documentation page, but not gone any

Implement filtering for Kafka messages

Submitted by 别来无恙 on 2019-12-11 06:56:42
Question: I have started using Kafka recently and am evaluating it for a few use cases. If we wanted to give consumers (subscribers) the ability to filter messages based on message content, what is the best approach? Say a producer exposes a topic named "Trades" that carries trade details such as market name, creation date, price, etc. Some consumers are interested in trades for specific markets, others in trades after a certain date, etc. (content
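Kafka has no broker-side, per-subscriber content filtering, so the two common patterns are filtering inside each consumer, or running a small Kafka Streams job that splits the broad topic into narrower ones. A sketch of the latter (the topic names, the Trade type, and its accessor are assumptions; serde configuration is omitted):

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KStream;

    StreamsBuilder builder = new StreamsBuilder();
    // Read every trade, keep one market, and republish to a dedicated topic
    // that only the interested consumers subscribe to.
    KStream<String, Trade> trades = builder.stream("Trades");
    trades.filter((key, trade) -> "NYSE".equals(trade.getMarket())) // assumed accessor
          .to("Trades-NYSE");                                       // assumed output topic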

Spring Kafka: Poll for new messages instead of being notified using `onMessage`

Submitted by £可爱£侵袭症+ on 2019-12-11 06:39:39
Question: I am using Spring Kafka in my project, as it seemed the natural choice for consuming Kafka messages in a Spring-based project. To consume messages, I can implement the MessageListener interface; Spring Kafka internally takes care of invoking my onMessage method for each new message. However, in my setup I would prefer to explicitly poll for new messages and work on them sequentially (which will take a few seconds). As a workaround, I might just block inside my onMessage implementation, or buffer the
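One way to keep Spring's configuration but poll explicitly is to ask the ConsumerFactory for a raw consumer and drive the loop yourself. A sketch, assuming a configured ConsumerFactory<String, String> bean, a hypothetical topic name, and a running flag owned by the caller:

    import java.time.Duration;
    import java.util.Collections;
    import org.apache.kafka.clients.consumer.Consumer;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;

    // Bypass MessageListener entirely: create a consumer and poll on your own schedule.
    Consumer<String, String> consumer = consumerFactory.createConsumer();
    consumer.subscribe(Collections.singletonList("my-topic"));     // assumed topic
    while (running) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
        for (ConsumerRecord<String, String> record : records) {
            process(record); // sequential handling; may take a few seconds each
        }
    }
    consumer.close();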

HPA using Kafka Exporter in an on-premise Kubernetes cluster

Submitted by 喜欢而已 on 2019-12-11 05:54:22
Question: I have been trying to implement Kubernetes HPA using metrics from kafka-exporter. HPA supports Prometheus, so we tried writing the metrics to a Prometheus instance. From there, we are unclear on the next steps. Is there an article that explains this in detail? I followed https://medium.com/google-cloud/kubernetes-hpa-autoscaling-with-kafka-metrics-88a671497f07 for the same in GCP, where we used Stackdriver, and the implementation worked like a charm. But we are struggling with the on-premise setup,
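For an on-premise cluster the usual chain is kafka-exporter → Prometheus → prometheus-adapter (which republishes the metric on the external metrics API) → HPA. A sketch of the final HPA manifest, assuming prometheus-adapter is already serving kafka-exporter's kafka_consumergroup_lag metric; the deployment name, topic label, and threshold are placeholders:

    apiVersion: autoscaling/v2beta2
    kind: HorizontalPodAutoscaler
    metadata:
      name: consumer-hpa
    spec:
      scaleTargetRef:
        apiVersion: apps/v1
        kind: Deployment
        name: my-consumer                  # assumed deployment name
      minReplicas: 1
      maxReplicas: 10
      metrics:
      - type: External
        external:
          metric:
            name: kafka_consumergroup_lag  # exported by kafka-exporter
            selector:
              matchLabels:
                topic: my-topic            # assumed label
          target:
            type: AverageValue
            averageValue: "100"            # scale out when average lag per pod exceeds 100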

What is the best practice for consuming messages from multiple Kafka topics?

Submitted by 被刻印的时光 ゝ on 2019-12-11 05:47:16
Question: I need to consume messages from different Kafka topics. Should I create a separate consumer instance per topic and then start a new processing thread per partition, or should I subscribe to all topics from a single consumer instance and then start the different processing threads? Thanks & regards, Megha

Answer 1: The only rule is that you have to account for what Kafka does and does not guarantee: Kafka only guarantees message order within a single topic/partition. Edit: this also
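To illustrate the single-instance option while respecting the per-partition ordering the answer points out, a sketch with assumed topic names and handler (consumer is a KafkaConsumer<String, String>):

    import java.time.Duration;
    import java.util.Arrays;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.common.TopicPartition;

    consumer.subscribe(Arrays.asList("topic-a", "topic-b")); // one consumer, many topics
    while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
        // Walk partition by partition: order is only guaranteed within one.
        for (TopicPartition tp : records.partitions()) {
            for (ConsumerRecord<String, String> record : records.records(tp)) {
                process(record); // hand off to a per-partition worker when parallelizing
            }
        }
    }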

Strimzi - Connecting external clients

Submitted by 眉间皱痕 on 2019-12-11 05:40:50
Question: Following on the discussion here, I used the steps below to enable an external client (based on kafkajs) to connect to Strimzi on OpenShift. These steps are from here.

Enable the external route. The kafka-persistent-single.yaml is edited as shown below:

    apiVersion: kafka.strimzi.io/v1beta1
    kind: Kafka
    metadata:
      name: my-cluster
    spec:
      kafka:
        version: 2.3.0
        replicas: 1
        listeners:
          plain: {}
          tls: {}
          external:
            type: route
        config:
          offsets.topic.replication.factor: 1
          transaction.state.log.replication
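To show where the route ends up being used, a hedged sketch of an external client over TLS, written with the Java client rather than kafkajs (the route hostname, truststore path, and password are assumptions; the truststore must contain Strimzi's cluster CA certificate, and routes terminate TLS on port 443):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.common.serialization.StringSerializer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "my-cluster-kafka-bootstrap-myproject.apps.example.com:443"); // assumed route host
    props.put("security.protocol", "SSL");
    props.put("ssl.truststore.location", "/tmp/truststore.jks"); // holds the cluster CA cert
    props.put("ssl.truststore.password", "changeit");            // assumed password
    KafkaProducer<String, String> producer =
        new KafkaProducer<>(props, new StringSerializer(), new StringSerializer());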

How to define multiple serializers in Kafka?

Submitted by 柔情痞子 on 2019-12-11 05:34:43
Question: Say I publish and consume different types of Java objects. For each, I have to define my own serializer implementation. How can we provide all the implementations in the Kafka consumer/producer properties file under the "serializer.class" property?

Answer 1: We have a similar setup, with different objects in different topics but always the same object type within one topic. We use the ByteArrayDeserializer that comes with the Java API 0.9.0.1, which means our message consumers only ever get a byte[] as the
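A sketch of that pattern (the topic names, Trade/Quote types, handlers, and per-topic deserializer instances are assumptions): consume raw bytes and choose a deserializer per topic in application code, since the properties file accepts only one class.

    import java.time.Duration;
    import java.util.Arrays;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");   // assumed
    props.put("group.id", "multi-type-consumer");       // assumed
    props.put("key.deserializer", StringDeserializer.class.getName());
    props.put("value.deserializer", ByteArrayDeserializer.class.getName());

    KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props);
    consumer.subscribe(Arrays.asList("trades", "quotes")); // assumed: one type per topic
    while (true) {
        for (ConsumerRecord<String, byte[]> record : consumer.poll(Duration.ofMillis(500))) {
            // Dispatch on topic name; each topic carries exactly one object type.
            switch (record.topic()) {
                case "trades": handleTrade(tradeDeserializer.deserialize(record.topic(), record.value())); break;
                case "quotes": handleQuote(quoteDeserializer.deserialize(record.topic(), record.value())); break;
            }
        }
    }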