I am developing a generic Spark application in Java that listens to a Kafka stream.
I am using kafka_2.11-0.10.2.2 and spark-2.3.2-bin-hadoop2.7. I also t
You need to use the Maven Shade Plugin to package the Kafka clients along with your Spark application. Then you can submit the shaded JAR, and the Kafka serializers should be found on the classpath.
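For reference, a minimal shade plugin configuration could look like the sketch below; the plugin version and the ServicesResourceTransformer are assumptions, so adjust them to your build:

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <!-- assumed version; use whatever is current for your project -->
                <version>3.2.1</version>
                <executions>
                    <execution>
                        <!-- bind the shade goal to the package phase so `mvn package` produces the fat JAR -->
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <!-- merge META-INF/services entries so service loaders keep working in the shaded JAR -->
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>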
Also, make sure you mark the Spark dependencies themselves as provided, so they are not shaded into the JAR:
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${spark.scala.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${spark.scala.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
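The Kafka integration dependency, on the other hand, should stay at the default compile scope so the shade plugin actually bundles the Kafka clients into the fat JAR. A sketch, assuming the spark-streaming-kafka-0-10 integration (use spark-sql-kafka-0-10 instead if you are on Structured Streaming):

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_${spark.scala.version}</artifactId>
        <version>${spark.version}</version>
        <!-- no <scope>provided</scope> here: this and its kafka-clients dependency must end up in the shaded JAR -->
    </dependency>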