Automatically including jars to PySpark classpath

一生所求 · 2020-12-14 11:07

I'm trying to automatically include JARs in my PySpark classpath. Right now I can type the following command and it works:

$ pyspark --jars /path/to/my.jar

How can I get this jar included automatically, without having to pass --jars on every invocation?
3 Answers
  •  星月不相逢 · 2020-12-14 11:50

    As far as I know, you have to make the jar visible to both the driver AND the executors. So you need to edit conf/spark-defaults.conf and add both of the lines below:

    spark.driver.extraClassPath /path/to/my.jar
    spark.executor.extraClassPath /path/to/my.jar
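
    (If conf/spark-defaults.conf does not exist yet, Spark ships a template for it: copy conf/spark-defaults.conf.template to conf/spark-defaults.conf inside your Spark installation directory and add the lines there.)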
    

    When I went through this, I did not need any other parameters; you probably will not need them either.
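
    For completeness, the same two properties can also be set when building the session from Python. This is only a minimal sketch (not part of the original answer): the app name is made up, /path/to/my.jar is the placeholder path from the question, and note that spark.driver.extraClassPath generally has no effect if it is set after the driver JVM is already running, which is why spark-defaults.conf is the reliable place for it.

    # Minimal sketch: setting both classpath properties programmatically.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("jars-on-classpath")  # hypothetical app name
        # Executor-side entry; applied when the executors launch.
        .config("spark.executor.extraClassPath", "/path/to/my.jar")
        # Driver-side entry; only effective if the driver JVM has not
        # started yet, otherwise use conf/spark-defaults.conf as above.
        .config("spark.driver.extraClassPath", "/path/to/my.jar")
        .getOrCreate()
    )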
