I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using
./bin/spark-shell
and it failed with the error below. I think there are actually two separate errors.
For error 1, I tried:
Neither suggestion worked. Then I tried the following, and it did:
import org.apache.spark.{SparkConf, SparkContext}

// Bind the driver explicitly to the loopback address
val conf = new SparkConf()
  .setAppName("SparkExample")
  .setMaster("local[*]")
  .set("spark.driver.bindAddress", "127.0.0.1")
val sc = new SparkContext(conf)
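Another option worth trying for bind-address failures like this (assuming the cause is local hostname resolution, a frequent problem on Mac) is to set SPARK_LOCAL_IP before launching the shell; 127.0.0.1 here assumes a purely local setup:

export SPARK_LOCAL_IP=127.0.0.1
./bin/spark-shell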
For error 2, you can try:

import org.apache.spark.sql.SparkSession

val sqlContext = SparkSession.builder.config("spark.master", "local[*]").getOrCreate()

and then

import sqlContext.implicits._
The SparkSession builder reuses the existing SparkContext if one already exists; otherwise it creates a new one. You can also create both explicitly if necessary.
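For reference, here is a minimal end-to-end sketch combining both fixes (the app name and sample data are only illustrative, and spark.driver.bindAddress again assumes a purely local setup):

import org.apache.spark.sql.SparkSession

// Build (or reuse) a session bound to the loopback address
val spark = SparkSession.builder
  .appName("SparkExample")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()

// The implicits must be imported from this specific session instance
import spark.implicits._

// Quick sanity check that the session works
val df = Seq((1, "a"), (2, "b")).toDF("id", "value")
df.show()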