Spark-submit ClassNotFound exception

我寻月下人不归 2020-12-29 06:01

I'm having problems with a "ClassNotFound" exception using this simple example:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
5 Answers
  •  Happy的楠姐
    2020-12-29 06:11

    I had this same issue. If the master is local, the program runs fine for most people. If it is set to "spark://myurl:7077" (as in my case), it doesn't work. Most people hit this error because an anonymous class cannot be found on the executors at runtime. It is resolved by calling SparkContext.addJar("path to jar").

    Make sure you are doing the following:

    • Call SparkContext.addJar("path to the jar built by Maven [hint: mvn package]").
    • I used SparkConf.setMaster("spark://myurl:7077") in code and supplied the same master URL as an argument when submitting the job via the command line.
    • When you specify the class on the command line, make sure you write its fully qualified name, e.g. "packageName.ClassName".
    • The final command should look like this: bin/spark-submit --class "packageName.ClassName" --master spark://myurl:7077 pathToYourJar/target/yourJarFromMaven.jar

    Note: the jar pathToYourJar/target/yourJarFromMaven.jar in the last point is the same one registered in code in the first point of this answer.
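
    Putting the points above together, here is a minimal driver sketch. It assumes the placeholder values from this answer (the master URL spark://myurl:7077, the jar path from mvn package, and the object name ClassNameApp are illustrative, not real values), and it needs a running Spark cluster, so it is a sketch rather than a tested program:

        import org.apache.spark.{SparkConf, SparkContext}

        object ClassNameApp {
          def main(args: Array[String]): Unit = {
            // The master URL here must match the one passed to
            // spark-submit with --master on the command line.
            val conf = new SparkConf()
              .setAppName("ClassNameApp")
              .setMaster("spark://myurl:7077")
            val sc = new SparkContext(conf)

            // Ship the application jar (built with `mvn package`) to the
            // executors, so anonymous classes and closures defined in the
            // driver can be resolved on the worker nodes.
            sc.addJar("pathToYourJar/target/yourJarFromMaven.jar")

            // ... job logic goes here ...

            sc.stop()
          }
        }

    It would then be submitted with the command from the last bullet point, with --class pointing at the fully qualified object name, e.g. "packageName.ClassNameApp". (Note the actual method on SparkContext is addJar, singular, taking one path.)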
