Replay Kafka topic with Server-Sent-Events

Submitted by Deadly on 2021-02-11 12:35:45

Question


I'm thinking about the following use case and would like to check whether the approach is conceptually valid.

The goal is to expose a long-running Server-Sent Events (SSE) endpoint in Spring that replays the same Kafka topic for each incoming connection (with some user-specific filtering).

The SSE endpoint is exposed like this:

  @GetMapping("/sse")
  public SseEmitter sse() {
    SseEmitter sseEmitter = new SseEmitter();

    Executors
          .newSingleThreadExecutor()
          .execute(() -> dummyDataProducer.generate()  // kafka ultimately
                .forEach(payload -> {
                  try {
                    sseEmitter.send(payload);
                  } catch (IOException ex) {
                    sseEmitter.completeWithError(ex);
                  }
                }));

    return sseEmitter;
  }

On the consuming side, there is a @KafkaListener method (a ConcurrentKafkaListenerContainerFactory is used):

  @KafkaListener(topics = "${app.kafka.topic1}")
  public void receive(
        @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) Integer id,
        @Payload Object payload) {
    // do something ...
  }
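
As written, I suppose the two pieces would have to be connected through something like a shared registry of emitters that the single listener thread fans records out to. Below is only a rough, untested sketch of that wiring (the SseEmitterRegistry class and the filtering hook are names I made up); note that it would only forward new records to already-connected clients and would not replay the topic from the beginning for each new connection:

  import java.io.IOException;
  import java.util.List;
  import java.util.concurrent.CopyOnWriteArrayList;

  import org.springframework.kafka.annotation.KafkaListener;
  import org.springframework.kafka.support.KafkaHeaders;
  import org.springframework.messaging.handler.annotation.Header;
  import org.springframework.messaging.handler.annotation.Payload;
  import org.springframework.stereotype.Component;
  import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

  @Component
  public class SseEmitterRegistry {

    // all currently connected SSE clients
    private final List<SseEmitter> emitters = new CopyOnWriteArrayList<>();

    public SseEmitter register() {
      SseEmitter emitter = new SseEmitter(0L); // no timeout for a long-running stream
      emitters.add(emitter);
      emitter.onCompletion(() -> emitters.remove(emitter));
      emitter.onTimeout(() -> emitters.remove(emitter));
      return emitter;
    }

    @KafkaListener(topics = "${app.kafka.topic1}")
    public void receive(@Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) Integer id,
                        @Payload Object payload) {
      for (SseEmitter emitter : emitters) {
        try {
          emitter.send(payload);          // user-specific filtering by the record key could go here
        } catch (IOException ex) {
          emitters.remove(emitter);       // drop broken connections
          emitter.completeWithError(ex);
        }
      }
    }
  }

The controller would then simply return sseEmitterRegistry.register() instead of building the SseEmitter itself.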

As far as I know, a Kafka consumer uses one thread to read from a single topic. That seems at odds with the SSE side, where a dedicated long-running thread is created for each incoming connection.
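
To make the "replay per connection" part concrete, what I imagine is roughly a variant of the /sse endpoint above where each connection gets its own throwaway consumer with a unique group id and auto.offset.reset=earliest, so every subscriber reads the topic from the beginning. This is an untested sketch; the bootstrap servers, topic name and deserializers are placeholders:

  import java.time.Duration;
  import java.util.List;
  import java.util.Properties;
  import java.util.UUID;
  import java.util.concurrent.ExecutorService;
  import java.util.concurrent.Executors;

  import org.apache.kafka.clients.consumer.ConsumerConfig;
  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.KafkaConsumer;
  import org.apache.kafka.common.serialization.IntegerDeserializer;
  import org.apache.kafka.common.serialization.StringDeserializer;
  import org.springframework.web.bind.annotation.GetMapping;
  import org.springframework.web.bind.annotation.RestController;
  import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

  @RestController
  public class ReplaySseController {

    @GetMapping("/sse")
    public SseEmitter sse() {
      SseEmitter emitter = new SseEmitter(0L); // no timeout for a long-running stream

      Properties props = new Properties();
      props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");       // placeholder
      props.put(ConsumerConfig.GROUP_ID_CONFIG, "sse-" + UUID.randomUUID());      // fresh group => no shared offsets
      props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");             // start from the beginning
      props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
      props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

      ExecutorService executor = Executors.newSingleThreadExecutor();
      executor.execute(() -> {
        try (KafkaConsumer<Integer, String> consumer = new KafkaConsumer<>(props)) {
          consumer.subscribe(List.of("topic1"));                                  // placeholder topic name
          while (!Thread.currentThread().isInterrupted()) {
            for (ConsumerRecord<Integer, String> record : consumer.poll(Duration.ofMillis(500))) {
              emitter.send(record.value());   // user-specific filtering could look at record.key()
            }
          }
        } catch (Exception ex) {
          emitter.completeWithError(ex);
        }
      });

      // stop the per-connection consumer thread when the client disconnects or times out
      emitter.onCompletion(executor::shutdownNow);
      emitter.onTimeout(executor::shutdownNow);
      return emitter;
    }
  }

With that shape, though, the @KafkaListener / ConcurrentKafkaListenerContainerFactory side would not be involved at all, which is part of what makes me unsure about the overall design.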

Is this a valid approach for the use case? If so, how can it be accomplished properly?

Source: https://stackoverflow.com/questions/65071532/replay-kafka-topic-with-server-sent-events
