Passing additional jars to Spark via spark-submit

Backend · Open · 2 answers · 931 views
甜味超标
Asked 2020-12-04 03:58

I'm using Spark with MongoDB, and consequently rely on the mongo-hadoop drivers. I got things working thanks to input on my original question here.

My

2 Answers
  •  不知归路
    Answered 2020-12-04 04:14

    The problem is that the classpath passed via --driver-class-path must be colon-separated, while the list passed via --jars must be comma-separated:

    $SPARK_HOME/bin/spark-submit \
    --driver-class-path /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar:/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar \
    --jars /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar,/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar my_application.py
    
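    Since the same jars appear in both flags, one way to keep the two forms from drifting apart is to maintain a single space-separated list and derive the colon- and comma-separated variants from it. A minimal sketch (the jar paths here are hypothetical placeholders):

    ```shell
    # Single source of truth: space-separated jar paths (hypothetical)
    JARS="/opt/libs/mongo-hadoop.jar /opt/libs/mongo-hadoop-spark.jar"

    # Colon-separated form for --driver-class-path
    CLASSPATH_ARG=$(echo "$JARS" | tr ' ' ':')

    # Comma-separated form for --jars
    JARS_ARG=$(echo "$JARS" | tr ' ' ',')

    echo "$CLASSPATH_ARG"
    echo "$JARS_ARG"
    ```

    You would then invoke spark-submit with `--driver-class-path "$CLASSPATH_ARG" --jars "$JARS_ARG"`, so adding a new jar means editing one list instead of two.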
