Add Yarn cluster configuration to Spark application
Question

I'm trying to use Spark on YARN from a Scala sbt application instead of using spark-submit directly. I already have a remote YARN cluster running, and I can connect to it and run Spark jobs from SparkR. But when I try to do a similar thing in a Scala application, it doesn't load my environment variables into the YARN configuration and instead uses the default YARN address and port.

The sbt application is just a simple object:

    object simpleSparkApp {
      def main(args: Array[String]): Unit = {
        val conf =