Apache Flink - Partitioning the stream the same way as the input Kafka topic
Question

I would like to implement the following scenario in Apache Flink: given a Kafka topic with 4 partitions, I would like to process the intra-partition data independently in Flink, applying a different logic depending on the event's type. In particular, suppose the input Kafka topic contains the events depicted in the images above. Each event has a different structure: partition 1 has the field "a" as key, partition 2 has the field "b" as key, etc. In Flink I would like to apply a different processing logic to each partition, depending on the event's type.
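A minimal sketch of one way to express this in Flink is shown below. It assumes the newer `KafkaSource` connector API and plain string payloads; the broker address, topic name, group id, the `TYPE_A`/`TYPE_B` output tags, and the string-based type check are all placeholders invented for illustration, not part of the original question. The idea is to read the topic once and fan the records out to side outputs, so that each event type (and hence each source partition, if types and partitions line up) gets its own downstream pipeline.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

public class PerPartitionLogicJob {

    // Hypothetical side-output tags, one per event type / Kafka partition.
    static final OutputTag<String> TYPE_A = new OutputTag<String>("type-a") {};
    static final OutputTag<String> TYPE_B = new OutputTag<String>("type-b") {};

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder connection settings; replace with the real broker and topic.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("input-topic")
                .setGroupId("flink-demo")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> raw =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");

        // Route each record to a side output based on which key field it carries.
        SingleOutputStreamOperator<String> routed = raw.process(new ProcessFunction<String, String>() {
            @Override
            public void processElement(String value, Context ctx, Collector<String> out) {
                if (value.contains("\"a\"")) {        // crude type check, for illustration only
                    ctx.output(TYPE_A, value);
                } else if (value.contains("\"b\"")) {
                    ctx.output(TYPE_B, value);
                } else {
                    out.collect(value);               // everything else stays on the main output
                }
            }
        });

        // Apply a different (here trivial) transformation per event type.
        routed.getSideOutput(TYPE_A).map(v -> "A-logic: " + v).print();
        routed.getSideOutput(TYPE_B).map(v -> "B-logic: " + v).print();

        env.execute("per-partition-logic");
    }
}
```

Note that this routes by event content rather than by Kafka partition number; if the routing really must follow the physical partition, the partition id can instead be extracted with a custom deserialization schema, which is a separate design decision from the sketch above.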