Spark spark-submit --jars argument wants a comma-separated list; how to declare a directory of jars?

Backend | Unresolved | 2 answers | 1510 views
旧巷少年郎 2020-11-27 07:32

In Submitting Applications in the Spark docs, as of 1.6.0 and earlier, it's not clear how to specify the --jars argument, as it's apparently not a colon-separated classpath.

2 answers
  •  一向 (original poster)
     2020-11-27 08:20

    One way (the only way?) to use the --jars argument is to supply a comma-separated list of explicitly named jars. The only way I figured out the comma format was a StackOverflow answer that led me to look beyond the docs to the command line:

    spark-submit --help 
    

    The output from that command contains:

     --jars JARS                 Comma-separated list of local jars to include on the driver
                                  and executor classpaths. 
    

    Today when I was testing --jars, I had to explicitly provide a path to each jar:

    /usr/local/spark/bin/spark-submit --class jpsgcs.thold.PipeLinkageData --jars local:/usr/local/spark/jars/groovy-all-2.3.3.jar,local:/usr/local/spark/jars/guava-14.0.1.jar,local:/usr/local/spark/jars/jopt-simple-4.6.jar,local:/usr/local/spark/jars/jpsgcs-core-1.0.8-2.jar,local:/usr/local/spark/jars/jpsgcs-pipe-1.0.6-7.jar /usr/local/spark/jars/thold-0.0.1-1.jar
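
    Since --jars takes only an explicit comma-separated list and not a directory, a common shell workaround is to build the list from a glob. This is a sketch assuming a POSIX shell; the demo uses a temporary directory with placeholder jars so it runs anywhere, and you would substitute your real jar directory:

    ```shell
    # Sketch: --jars will not accept a directory, but the shell can expand
    # a glob and join it with commas. The temp dir and placeholder jar
    # names here are for demonstration only.
    JARS_DIR=$(mktemp -d)
    touch "$JARS_DIR/a.jar" "$JARS_DIR/b.jar"

    # Expand *.jar and replace the separating spaces with commas.
    JARS=$(echo "$JARS_DIR"/*.jar | tr ' ' ',')
    echo "$JARS"

    # The resulting list can then be handed to spark-submit, e.g.:
    # spark-submit --class jpsgcs.thold.PipeLinkageData --jars "$JARS" app.jar
    ```

    Note that joining with tr breaks if a jar path itself contains spaces; for typical system jar directories like /usr/local/spark/jars it works fine.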
    
