unable to add spark to PYTHONPATH

Submitted by 主宰稳场 on 2020-01-13 09:56:08

Question


I am struggling to add Spark to my PYTHONPATH:

(myenv)me@me /home/me$ set SPARK_HOME="/home/me/spark-1.2.1-bin-hadoop2.4"
(myenv)me@me /home/me$ set PYTHONPATH=$PYTHONPATH:$SPARK_HOME:$SPARK_HOME/python:$SPARK_HOME/python/build:$SPARK_HOME/bin

(myenv)me@me /home/me$ python -c 'import sys; print(sys.path)'
['', '/home/me/.virtualenvs/default/lib/python2.7', '/home/me/.virtualenvs/default/lib/python2.7/plat-x86_64-linux-gnu', '/home/me/.virtualenvs/default/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/lib/python2.7/lib-old', '/home/me/.virtualenvs/default/lib/python2.7/lib-dynload', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-x86_64-linux-gnu', '/usr/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/local/lib/python2.7/site-packages', '/home/me/.virtualenvs/default/lib/python2.7/site-packages']

(myenv)me@me /home/me$ python -c 'import pyspark'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ImportError: No module named pyspark

Answer 1:


I've got the same problem, but this helped. Note that in bash, "set SPARK_HOME=..." does not assign an environment variable at all (it sets the shell's positional parameters), so nothing was exported to the python process; use "export" instead.

Just add the following lines to your .bashrc:

# point SPARK_HOME at your Spark installation (adjust the path to your version)
export SPARK_HOME=/path/to/your/spark-1.4.1-bin-hadoop2.6
# make the pyspark package and its build directory importable
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH
# the bundled py4j version varies by Spark release; check $SPARK_HOME/python/lib
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH
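
After editing .bashrc, reload it and verify the import in the same shell. A quick check, assuming the paths above match your installation:

source ~/.bashrc
python -c 'import pyspark; print(pyspark.__file__)'

If the import succeeds, this prints the location of the pyspark package inside $SPARK_HOME/python.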



Answer 2:


I think you mixed up PYTHONPATH and sys.path: PYTHONPATH is an environment variable that the interpreter reads at startup and prepends to sys.path. But are you sure you need to modify PYTHONPATH at all if you have pyspark installed properly?
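
You can check whether the variable actually reached Python. A minimal check, with /tmp/example standing in for any directory:

PYTHONPATH=/tmp/example python -c 'import sys; print("/tmp/example" in sys.path)'
# prints True if the interpreter saw the variable

In your transcript, the Spark directories are absent from sys.path, which means the variables were never exported to the python process.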

EDIT:

I haven't used pyspark myself, but would this help? See the question "importing pyspark in python shell".



Source: https://stackoverflow.com/questions/28829757/unable-to-add-spark-to-pythonpath
