What is use of method addJar() in Spark?

Submitted by 做~自己de王妃 on 2019-12-01 23:35:35

Did you try setting the path of the jar with the "local:" prefix? From the documentation:

public void addJar(String path)

Adds a JAR dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), an HTTP, HTTPS or FTP URI, or local:/path for a file on every worker node.
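As a sketch of the URI schemes the documentation mentions (the jar paths below are placeholders, and running this requires a Spark installation):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch; the jar paths are hypothetical examples, not real files.
val sc = new SparkContext(
  new SparkConf().setMaster("local[*]").setAppName("addJarDemo"))

sc.addJar("/tmp/deps/mylib.jar")                // local file on the driver, shipped to executors
sc.addJar("hdfs:///deps/mylib.jar")             // file in HDFS
sc.addJar("http://repo.example.com/mylib.jar")  // HTTP URI
sc.addJar("local:/opt/deps/mylib.jar")          // file already present on every worker node

sc.stop()
```

Note that `addJar` only affects tasks submitted after the call; jars needed from the start of the application are better passed via `setJars` or `--jars`.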

You can try this way as well:

val conf = new SparkConf()
             .setMaster("local[*]")
             .setAppName("tmp")
             .setJars(Array("/path1/one.jar", "/path2/two.jar"))

val sc = new SparkContext(conf)

and take a look at the Spark configuration documentation, in particular the spark.jars option,

and set the "--jars" parameter in spark-submit:

--jars /path/1.jar,/path/2.jar
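A full spark-submit invocation using this flag might look like the following (the class name and jar paths are placeholders):

```shell
# Hypothetical example: main class, dependency jars, and app jar are placeholders.
spark-submit \
  --class com.example.Main \
  --master local[*] \
  --jars /path/1.jar,/path/2.jar \
  app.jar
```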

or edit conf/spark-defaults.conf:

spark.driver.extraClassPath /path/1.jar:/fullpath/2.jar
spark.executor.extraClassPath /path/1.jar:/fullpath/2.jar