I have a Spark app which runs with no problem in local mode, but has some problems when submitting to the Spark cluster.
The error message is as follows:
If you are using the following code:
val sc = new SparkContext(master, "WordCount", System.getenv("SPARK_HOME"))
Then replace it with the following lines:
val jobName = "WordCount"
val conf = new SparkConf().setAppName(jobName)
val sc = new SparkContext(conf)
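For reference, a complete minimal version with the required import might look like the sketch below. The object layout and the word-count logic (reading the input path from args(0)) are illustrative assumptions, not part of your original code:

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Only the app name is set here; the master URL is supplied at submit time.
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    // Illustrative word count over a text file whose path is passed as args(0).
    val counts = sc.textFile(args(0))
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}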
In Spark 2.0 you can use the following code:
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .master("local[*]") // needed when running locally
  .getOrCreate()
You need to add .master("local[*]") when running locally. Here * means use all available cores; instead of * you can give a specific number of cores, such as 1, 2, or 8.
You need to set the master URL when running on a cluster, as in the sketch below.
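As a sketch of that case, assuming a standalone cluster whose master listens at the placeholder address your-master-host:7077 (on YARN you would use "yarn" instead), the builder could be configured like this; in practice the .master() call is often left out of the code entirely and the URL is passed to spark-submit via its --master option:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .master("spark://your-master-host:7077") // placeholder cluster master URL
  .getOrCreate()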