KeyError: SPARK_HOME during SparkConf initialization

Submitted by 无人久伴 on 2020-01-04 02:32:06

Question


I am a Spark newbie and I want to run a Python script from the command line. I have tested pyspark interactively and it works. I get this error when trying to create the SparkContext (sc):

File "test.py", line 10, in <module>
    conf=(SparkConf().setMaster('local').setAppName('a').setSparkHome('/home/dirk/spark-1.4.1-bin-hadoop2.6/bin'))
  File "/home/dirk/spark-1.4.1-bin-hadoop2.6/python/pyspark/conf.py", line 104, in __init__
    SparkContext._ensure_initialized()
  File "/home/dirk/spark-1.4.1-bin-hadoop2.6/python/pyspark/context.py", line 229, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/home/dirk/spark-1.4.1-bin-hadoop2.6/python/pyspark/java_gateway.py", line 48, in launch_gateway
    SPARK_HOME = os.environ["SPARK_HOME"]
  File "/usr/lib/python2.7/UserDict.py", line 23, in __getitem__
    raise KeyError(key)
KeyError: 'SPARK_HOME'

Answer 1:


It seems like there are two problems here.

The first one is the path you use. SPARK_HOME should point to the root directory of the Spark installation, so in your case it should probably be /home/dirk/spark-1.4.1-bin-hadoop2.6, not /home/dirk/spark-1.4.1-bin-hadoop2.6/bin.
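As a quick sanity check (a minimal sketch, not part of the original answer), you can verify that the directory you plan to use really is the install root by looking for bin/spark-submit inside it:

import os

# Hypothetical check: a valid SPARK_HOME is the install root,
# so it should contain bin/spark-submit.
spark_home = "/home/dirk/spark-1.4.1-bin-hadoop2.6"
assert os.path.isfile(os.path.join(spark_home, "bin", "spark-submit")), \
    "SPARK_HOME should be the Spark root, not its bin/ subdirectory"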

The second problem is the way you use setSparkHome. If you check its docstring, its goal is to

set path where Spark is installed on worker nodes

The SparkConf constructor assumes that SPARK_HOME on the master is already set. It calls pyspark.context.SparkContext._ensure_initialized, which calls pyspark.java_gateway.launch_gateway, which tries to access SPARK_HOME and fails.

To deal with this, you should set SPARK_HOME before you create the SparkConf:

import os
from pyspark import SparkConf

os.environ["SPARK_HOME"] = "/home/dirk/spark-1.4.1-bin-hadoop2.6"  # Spark install root
conf = SparkConf().setMaster('local').setAppName('a')
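With SPARK_HOME set, creating the context should now succeed, since launch_gateway can find the environment variable. A minimal usage sketch (the app name 'a' comes from the question):

from pyspark import SparkContext

sc = SparkContext(conf=conf)  # launches the JVM gateway using SPARK_HOME
print(sc.version)             # e.g. 1.4.1
sc.stop()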


Source: https://stackoverflow.com/questions/31566250/keyerror-spark-home-during-sparkconf-initialization
