I have a Spark job written in Scala. I use
spark-shell -i
to run the job. I need to pass a command-line argument to the job.
My solution is to use a custom key to define the arguments instead of spark.driver.extraJavaOptions, in case you someday pass in a value that might interfere with the JVM's behavior.
spark-shell -i your_script.scala --conf spark.driver.args="arg1 arg2 arg3"
You can access the arguments from within your Scala code like this:
val args = sc.getConf.get("spark.driver.args").split("\\s+")
// args: Array[String] = Array(arg1, arg2, arg3)
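
For completeness, here is a minimal sketch of what the script file itself could look like. The file name your_script.scala, the argument meanings, and the default values are all illustrative assumptions, not anything Spark requires; the only Spark-specific piece is reading the custom key back out of the conf (sc is predefined by spark-shell):

// your_script.scala -- run with:
// spark-shell -i your_script.scala --conf spark.driver.args="input.txt 10"

// Read the custom key; fall back to an empty string so the script
// still runs when no --conf spark.driver.args was supplied.
val args = sc.getConf.get("spark.driver.args", "").split("\\s+").filter(_.nonEmpty)

// Illustrative use of the arguments: first is a path, second a row count.
val inputPath = if (args.length > 0) args(0) else "default_input.txt"
val limit = if (args.length > 1) args(1).toInt else 5

sc.textFile(inputPath).take(limit).foreach(println)

Using the two-argument get(key, default) keeps the script usable both with and without the extra --conf, which is handy when iterating interactively in the shell.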