How to use Flink's KafkaSource in Scala?

Asked by 悲&欢浪女 on 2021-01-19 12:24

I'm trying to run a simple test program with Flink's KafkaSource. I'm using the following:

  • Flink 0.9
  • Scala 2.10.4
  • Kafka 0.8.2.1
3 Answers
  • Answered by 予麋鹿 (OP) on 2021-01-19 12:44

    I'm an sbt user, so I used the following build.sbt:

    organization := "pl.japila.kafka"
    scalaVersion := "2.11.7"
    
    // exclude the connector's transitive Kafka artifact and pull in Kafka explicitly
    libraryDependencies += "org.apache.flink" % "flink-connector-kafka" % "0.9.0" exclude("org.apache.kafka", "kafka_${scala.binary.version}")
    libraryDependencies += "org.apache.kafka" %% "kafka" % "0.8.2.1"
    

    That allowed me to run the program:

    import org.apache.flink.streaming.api.environment._
    import org.apache.flink.streaming.connectors.kafka.api._
    import org.apache.flink.streaming.util.serialization._
    
    object TestKafka {
      def main(args: Array[String]) {
        val env = StreamExecutionEnvironment.getExecutionEnvironment
        // KafkaSource(zookeeperAddress, topicId, deserializationSchema)
        val stream = env
          .addSource(new KafkaSource[String]("localhost:2181", "test", new SimpleStringSchema))
          .print
        // Note: call env.execute() to actually start consuming; without it,
        // `run` only builds the job graph and exits immediately.
      }
    }
    

    The output:

    [kafka-flink]> run
    [info] Running TestKafka
    log4j:WARN No appenders could be found for logger (org.apache.flink.streaming.api.graph.StreamGraph).
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
    [success] Total time: 0 s, completed Jul 15, 2015 9:29:31 AM
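
    For reference, the third KafkaSource argument is the deserialization schema that turns each Kafka message's raw bytes into the stream's element type. A minimal stand-alone sketch of what SimpleStringSchema roughly does (the StringDecoder trait below is a hypothetical stand-in, not Flink's actual DeserializationSchema):

```scala
import java.nio.charset.StandardCharsets

// Hypothetical stand-in for a deserialization schema: the Kafka source
// hands each message's payload bytes to the schema, which produces the
// stream's element type (String here).
trait StringDecoder {
  def deserialize(bytes: Array[Byte]): String
}

object Utf8StringDecoder extends StringDecoder {
  // Decode the message payload as UTF-8 text
  def deserialize(bytes: Array[Byte]): String =
    new String(bytes, StandardCharsets.UTF_8)
}
```

    Any type with a matching schema can be streamed the same way; SimpleStringSchema is just the ready-made schema for plain strings.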
    
