Why does Spark application fail with “ClassNotFoundException: Failed to find data source: kafka” as uber-jar with sbt assembly?
**Question**

I'm trying to run a sample like https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/sql/streaming/StructuredKafkaWordCount.scala. I started with the Spark Structured Streaming Programming Guide at http://spark.apache.org/docs/latest/structured-streaming-programming-guide.html.

My code is:

```scala
package io.boontadata.spark.job1

import org.apache.spark.sql.SparkSession

object DirectKafkaAggregateEvents {
  val FIELD_MESSAGE_ID = 0
  val FIELD_DEVICE_ID = 1
  val
```
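For context on why this error typically appears with an uber-jar: Spark discovers data sources such as `kafka` through Java's `ServiceLoader`, which reads `META-INF/services/org.apache.spark.sql.sources.DataSourceRegister` files from the classpath. sbt-assembly's default merge strategy keeps only one such file when several jars provide it, which can drop the Kafka source's registration. A minimal sketch of a `build.sbt` that adds the Kafka source dependency and concatenates the service files instead (version numbers here are illustrative assumptions; match them to your Spark version):

```scala
// build.sbt sketch (assumed versions; align with your cluster's Spark)
name := "job1"
scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "3.5.0" % Provided,
  // The Kafka data source lives in a separate artifact; it must be on the
  // application classpath (bundled or passed via --packages).
  "org.apache.spark" %% "spark-sql-kafka-0-10" % "3.5.0"
)

// Concatenate ServiceLoader registration files instead of picking one,
// so the Kafka DataSourceRegister entry survives the assembly merge.
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", "services", _*) => MergeStrategy.concat
  case PathList("META-INF", _*)             => MergeStrategy.discard
  case _                                    => MergeStrategy.first
}
```

Alternatively, skipping the uber-jar route and submitting with `--packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0` sidesteps the merge problem entirely.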