I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using
./bin/spark-shell
and got the error below.
If you are running Scala code in an IDE and hit the same issue, and you are not using SparkConf() as pointed out above but SparkSession() instead, then you can bind the localhost address as follows. Note that .set() only works on SparkConf(); with SparkSession() you should use .config() to set the Spark configuration, as shown below:
import org.apache.spark.sql.SparkSession

// Bind the driver to localhost via .config() when building a SparkSession
val spark = SparkSession
  .builder()
  .appName("CSE512-Phase1")
  .master("local[*]")
  .config("spark.driver.bindAddress", "localhost")
  .getOrCreate()
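For comparison, here is a minimal sketch of the SparkConf() route mentioned above, where .set() is the way to apply the same setting (the app name and master value are placeholders taken from the example above):

import org.apache.spark.{SparkConf, SparkContext}

// Equivalent fix when building a SparkContext directly from SparkConf:
// .set() applies the same spark.driver.bindAddress property.
val conf = new SparkConf()
  .setAppName("CSE512-Phase1")  // placeholder app name
  .setMaster("local[*]")
  .set("spark.driver.bindAddress", "localhost")

val sc = new SparkContext(conf)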