In a Spark job, I don't know how to import and use the jars that are shared via the method SparkContext.addJar(). It seems that this method is able to move jars into some…
Did you try setting the path of the jar with the "local:" prefix? From the documentation:
public void addJar(String path)
Adds a JAR dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), an HTTP, HTTPS or FTP URI, or local:/path for a file on every worker node.
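For example, a minimal sketch of using addJar with a "local:" path (the jar path below is a placeholder, and it assumes the jar already exists at that location on every worker node):
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("addJar-example"))

// "local:" means the file is already present at this path on each worker,
// so nothing is uploaded; executors load it from their local filesystem.
sc.addJar("local:/opt/libs/mylib.jar")

// Classes from the added jar become available to code running in tasks on
// the executors (e.g. inside map/filter closures), not on the driver.
val doubled = sc.parallelize(1 to 10).map(_ * 2).collect()
println(doubled.mkString(","))
sc.stop()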
You can try this way as well:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("tmp")
  .setJars(Array("/path1/one.jar", "/path2/two.jar"))
val sc = new SparkContext(conf)
Also take a look at the Spark configuration documentation and check the spark.jars option.
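For instance, a minimal sketch of setting spark.jars directly on the SparkConf (the jar paths are placeholders):
import org.apache.spark.{SparkConf, SparkContext}

// spark.jars takes a comma-separated list of jars to include on the
// driver and executor classpaths
val conf = new SparkConf()
  .setAppName("tmp")
  .set("spark.jars", "/path1/one.jar,/path2/two.jar")
val sc = new SparkContext(conf)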
and set "--jars" param in spark-submit:
--jars /path/1.jar,/path/2.jar
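For example, a full spark-submit invocation might look like this (the main class, master, and jar paths are hypothetical):
spark-submit \
  --class com.example.Main \
  --master yarn \
  --jars /path/1.jar,/path/2.jar \
  /path/app.jar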
Or edit conf/spark-defaults.conf:
spark.driver.extraClassPath /path/1.jar:/path/2.jar
spark.executor.extraClassPath /path/1.jar:/path/2.jar