Passing command line arguments to Spark-shell

耶瑟儿~ 2020-12-05 03:23

I have a Spark job written in Scala. I use

spark-shell -i <file-name>

to run the job. I need to pass a command-line argument to the job.

3 Answers
心在旅途 2020-12-05 03:54

    I use extraJavaOptions when I have a Scala script that is too simple to go through the build process but I still need to pass arguments to it. It isn't pretty, but it works, and you can quickly pass multiple arguments:

    spark-shell -i your_script.scala --conf spark.driver.extraJavaOptions="-Darg1,arg2,arg3"
    

    Note that the -D prefix is not part of the arguments themselves; the arguments are arg1, arg2, and arg3. You can then access them from within your Scala code like this:

    import org.apache.spark.SparkConf

    val sconf = new SparkConf()

    // load the raw option string, e.g. "-Darg1,arg2,arg3"
    val paramsString = sconf.get("spark.driver.extraJavaOptions")

    // cut off the leading `-D`
    val paramsSlice = paramsString.slice(2, paramsString.length)

    // split the string on `,` and collect the pieces into an array
    val paramsArray = paramsSlice.split(",")

    // access individual parameters by position
    val arg1 = paramsArray(0)
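
    If you prefer named over positional arguments, the same trick extends naturally: pass key=value pairs after -D and split them into a Map. Below is a minimal sketch under that assumption; the keys input and mode and the file your_script.scala are hypothetical, chosen just for illustration:

    import org.apache.spark.SparkConf

    // launched as, e.g.:
    // spark-shell -i your_script.scala \
    //   --conf spark.driver.extraJavaOptions="-Dinput=/tmp/data.csv,mode=test"

    val raw = new SparkConf().get("spark.driver.extraJavaOptions")

    // drop the leading `-D`, split the pairs, and build a key -> value map
    val params: Map[String, String] = raw
      .stripPrefix("-D")
      .split(",")
      .map { pair =>
        val Array(k, v) = pair.split("=", 2)   // hypothetical key=value format
        k -> v
      }
      .toMap

    val input = params("input")  // "/tmp/data.csv"
    val mode  = params("mode")   // "test"

    This keeps the arguments order-independent and self-documenting at the cost of a slightly longer command line; note that it would throw a MatchError on a pair without =, which is acceptable for a quick script but not for production code.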
    
