Mac spark-shell Error initializing SparkContext

礼貌的吻别 2020-12-07 09:26

I tried to start spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using

"./bin/spark-shell".

It has the error below.

12 Answers
  •  感动是毒
    2020-12-07 09:50

    There are two errors, I think.

    1. Your Spark local IP was not correct and needs to be changed to 127.0.0.1.
    2. You didn't define sqlContext properly.

    For 1. I tried:

    • 1) exported SPARK_LOCAL_IP="127.0.0.1" in ~/.bash_profile
    • 2) added export SPARK_LOCAL_IP="127.0.0.1" in load-spark-env.sh under $SPARK_HOME

    But neither worked. Then I tried the following and it worked:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().
        setAppName("SparkExample").
        setMaster("local[*]").
        set("spark.driver.bindAddress", "127.0.0.1")
    val sc = new SparkContext(conf)
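    To confirm the driver bound correctly, you can run a trivial job against the new context (a hypothetical smoke test; it reuses the `sc` created above):

```scala
// Hypothetical smoke test: if the bind address fix worked, this simple
// job completes instead of failing with "Error initializing SparkContext".
val rdd = sc.parallelize(1 to 100)
println(rdd.sum())  // prints 5050.0
```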
    

    For 2. you can try:

    import org.apache.spark.sql.SparkSession

    val sqlContext = SparkSession.builder.config("spark.master", "local[*]").getOrCreate()
    

    and then import sqlContext.implicits._

    The SparkSession builder will automatically reuse an existing SparkContext if there is one; otherwise it creates a new one. You can explicitly create more than one session if necessary.
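    Putting the pieces together, the implicits import is what enables conversions like toDF on local collections (a minimal sketch, assuming Spark 2.x is on the classpath; the people data is made up for illustration):

```scala
import org.apache.spark.sql.SparkSession

// Build (or reuse) a session bound to the local master.
val sqlContext = SparkSession.builder
  .config("spark.master", "local[*]")
  .getOrCreate()

// Brings toDF and other conversions into scope for local Seqs.
import sqlContext.implicits._

// Hypothetical sample data, just to show the implicits working.
val people = Seq(("Alice", 29), ("Bob", 35)).toDF("name", "age")
people.show()
```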
