Spark throws ClassNotFoundException when using --jars option

Backend · Unresolved · 2 answers · 2045 views
抹茶落季 2020-12-25 14:48

I was trying to follow the Spark standalone application example described here https://spark.apache.org/docs/latest/quick-start.html#standalone-applications

The exam

2 Answers
  •  醉酒成梦
    2020-12-25 15:24

    According to spark-submit's --help, the --jars option expects a comma-separated list of local jars to include on the driver and executor classpaths.
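
    For example (the jar names below are hypothetical placeholders), the dependencies have to be joined by commas, not spaces:

    spark-submit --jars /dir/of/jars/dep1.jar,/dir/of/jars/dep2.jar \
        --class "SimpleApp" --master local[4] path/to/myApp.jar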

    I think that what's happening here is that /home/linpengt/workspace/scala-learn/spark-analysis/target/pack/lib/* is expanding into a space-separated list of jars and the second JAR in the list is being treated as the application jar.
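
    You can see the mix-up with a plain echo (directory and jar names are hypothetical placeholders):

    # The shell expands the glob into space-separated arguments before spark-submit runs:
    echo --jars /dir/of/jars/*.jar
    # prints something like:
    #   --jars /dir/of/jars/dep1.jar /dir/of/jars/dep2.jar
    # spark-submit then takes only dep1.jar as the value of --jars and treats
    # dep2.jar as the application jar, which does not contain the requested class.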

    One solution is to use your shell to build a comma-separated list of jars; here's a quick way of doing it in bash, based on this answer on StackOverflow (see that answer for more complex approaches that handle filenames that contain spaces):

    spark-submit --jars $(echo /dir/of/jars/*.jar | tr ' ' ',') \
        --class "SimpleApp" --master local[4] path/to/myApp.jar
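
    If any of the jar file names contain spaces, one approach (a sketch, not taken from the linked answer) is to build the list from one path per line using find and paste; it still breaks on paths that contain commas or newlines:

    # Join one jar path per line with commas instead of relying on word splitting
    JARS=$(find /dir/of/jars -name '*.jar' | paste -sd, -)
    spark-submit --jars "$JARS" \
        --class "SimpleApp" --master local[4] path/to/myApp.jar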
    
