Automatically including jars to PySpark classpath

一生所求 2020-12-14 11:07

I'm trying to automatically include jars on my PySpark classpath. Right now I can type the following command and it works:

$ pyspark --jars /path/to/my.jar
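
What I'd like is for the jar to be picked up automatically instead of typing the flag every time. As a minimal sketch of what I have in mind (assuming a standard install with a conf/spark-defaults.conf file; the jar path is a placeholder), the equivalent setting could go there once:

    # conf/spark-defaults.conf -- illustrative sketch, the jar path is a placeholder
    # Equivalent to launching with `pyspark --jars /path/to/my.jar`
    spark.jars  /path/to/my.jar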
         


        
3 Answers
  •  不知归路 2020-12-14 11:31

    The recommended way since Spark 2.0+ is to use spark.driver.extraLibraryPath and spark.executor.extraLibraryPath:

    https://spark.apache.org/docs/2.4.3/configuration.html#runtime-environment

    P.S. spark.driver.extraClassPath and spark.executor.extraClassPath are still available, but they are deprecated and will be removed in a future Spark release.
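
    For illustration, a minimal sketch of how such properties can be supplied when launching PySpark (the paths are placeholders; per the linked configuration page, the classpath properties take jar/class-dir entries while the library-path properties point at directories containing native libraries):

        # illustrative sketch -- all paths are placeholders
        $ pyspark \
            --conf spark.driver.extraClassPath=/path/to/my.jar \
            --conf spark.executor.extraClassPath=/path/to/my.jar \
            --conf spark.driver.extraLibraryPath=/path/to/native/lib/dir \
            --conf spark.executor.extraLibraryPath=/path/to/native/lib/dir

    Setting the same keys once in conf/spark-defaults.conf has the same effect and avoids retyping them on every launch.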
