Automatically including jars to PySpark classpath

Backend · open · 3 answers · 1599 views
一生所求 · 2020-12-14 11:07

I'm trying to automatically include jars in my PySpark classpath. Right now I can type the following command and it works:

$ pyspark --jars /path/to/my.jar
How can I have the jar included automatically, without passing --jars on the command line every time?
3 Answers
南笙 (OP) · 2020-12-14 11:29

    You can add the jar files in the spark-defaults.conf file (located in the conf folder of your Spark installation). If there is more than one entry in the jars list, use : as the separator (the standard Java classpath separator on Linux/macOS; Windows uses ;):

    spark.driver.extraClassPath /path/to/my.jar
    
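    With more than one jar, the same entry would look like this (a minimal sketch; the paths are placeholders for your own jars):

    spark.driver.extraClassPath /path/to/my.jar:/path/to/other.jar
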

    This property is documented under "Runtime Environment" in the Spark configuration guide: https://spark.apache.org/docs/1.3.1/configuration.html#runtime-environment. Note that in client mode it cannot be set through SparkConf inside your application, because the driver JVM has already started by that point; set it in the conf file or via the --driver-class-path option instead.
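
    Once the entry is in place, launching pyspark with no --jars flag should put the jar on the driver classpath. Below is a minimal sketch to verify that, assuming your jar contains a class named com.example.MyClass (a placeholder) and using PySpark's private _jvm gateway to reach the JVM:

    from pyspark.sql import SparkSession

    # spark-defaults.conf is read automatically when the driver JVM is
    # launched, so no --jars flag is needed here.
    spark = SparkSession.builder.appName("jar-check").getOrCreate()

    # Try to load a class that lives in the jar through the py4j gateway.
    # com.example.MyClass is a placeholder; substitute a real class name.
    clazz = spark._jvm.java.lang.Class.forName("com.example.MyClass")
    print(clazz.getName())  # succeeds only if the jar is on the classpath

    spark.stop()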
