Passing command line arguments to Spark-shell

耶瑟儿~ 2020-12-05 03:23

I have a Spark job written in Scala. I use

spark-shell -i <file-name>

to run the job. I need to pass a command-line argument to the job.

3 Answers
  •  没有蜡笔的小新
    2020-12-05 03:40

    My solution is to use a custom configuration key for the arguments instead of spark.driver.extraJavaOptions, in case you someday pass in a value that could interfere with the JVM's behavior.

    spark-shell -i your_script.scala --conf spark.driver.args="arg1 arg2 arg3"
    

    You can then access the arguments from within your Scala code like this:

    val args = sc.getConf.get("spark.driver.args").split("\\s+")
    args: Array[String] = Array(arg1, arg2, arg3)
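
    For a fuller picture, here is a minimal sketch of what the invoked script (a hypothetical your_script.scala) might look like. It uses SparkConf's getOption instead of get so the script fails gracefully when the key was not supplied; the three expected arguments are purely illustrative:

    // your_script.scala -- run with:
    //   spark-shell -i your_script.scala --conf spark.driver.args="input.csv output.csv 10"
    // spark-shell already provides `sc` (the SparkContext).
    val rawArgs = sc.getConf.getOption("spark.driver.args").getOrElse("")
    // split("\\s+") on an empty string yields Array(""), so drop empty tokens.
    val args = rawArgs.split("\\s+").filter(_.nonEmpty)
    args match {
      case Array(input, output, n) =>
        println(s"input=$input, output=$output, n=$n")
      case _ =>
        println("expected: --conf spark.driver.args=\"<input> <output> <n>\"")
    }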
    
