Spark throws ClassNotFoundException when using --jars option

抹茶落季 2020-12-25 14:48

I was trying to follow the Spark standalone application example described here https://spark.apache.org/docs/latest/quick-start.html#standalone-applications

The example works when I follow the guide, but when I submit my own jar with spark-submit using the --jars option, Spark throws a ClassNotFoundException for the SimpleApp class.

2 Answers
  •  滥情空心
    2020-12-25 15:19

    Is your SimpleApp class in any specific package? If so, you need to pass the fully qualified class name on the command line. For example, if SimpleApp lives in com.yourcompany.yourpackage, you'd have to submit the Spark job with --class "com.yourcompany.yourpackage.SimpleApp" instead of --class "SimpleApp", as sketched below. I had the same problem, and switching to the fully qualified class name fixed it. Hope that helps!
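    As a minimal sketch, assuming your class is declared in a package named com.yourcompany.yourpackage (the package name and the jar path below are placeholders for whatever your project actually uses):

        // SimpleApp.scala -- note the package declaration at the top;
        // it is what makes the fully qualified name necessary in --class
        package com.yourcompany.yourpackage

        import org.apache.spark.sql.SparkSession

        object SimpleApp {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
            // ... rest of the quick-start example ...
            spark.stop()
          }
        }

        # submit with the fully qualified class name
        ./bin/spark-submit \
          --class "com.yourcompany.yourpackage.SimpleApp" \
          --master "local[4]" \
          target/scala-2.12/simple-project_2.12-1.0.jar

    Note that the jar containing SimpleApp is passed as the application jar (the last argument), not via --jars; --jars is meant for additional dependency jars.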
