unable to add spark to PYTHONPATH

Submitted by 被刻印的时光 ゝ on 2019-12-05 10:04:33
ppk28

I ran into the same problem, and this is what fixed it.

Just add the following lines to your .bashrc:

# Point SPARK_HOME at your Spark installation
export SPARK_HOME=/path/to/your/spark-1.4.1-bin-hadoop2.6
# Put the pyspark sources and the bundled py4j on Python's module search path
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH
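After reloading your shell (e.g. source ~/.bashrc), a quick sanity check from Python confirms the paths are picked up. This is a minimal sketch; the printed locations depend on your install:

import os
import pyspark

# SPARK_HOME should match the export above, and pyspark should resolve from under it.
print(os.environ["SPARK_HOME"])
print(pyspark.__file__)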
laike9m

I think you mixed up PYTHONPATH and sys.path. But are you sure you need to modify PYTHONPATH if you have pyspark installed properly?
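For comparison, the same effect can be had at runtime through sys.path instead of PYTHONPATH. A minimal sketch, assuming the Spark 1.4.1 layout from the answer above; the fallback path is a placeholder to adjust for your install:

import os
import sys

# Runtime equivalent of the .bashrc exports; the default path here is a placeholder.
spark_home = os.environ.get("SPARK_HOME", "/path/to/your/spark-1.4.1-bin-hadoop2.6")
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, os.path.join(spark_home, "python/lib/py4j-0.8.2.1-src.zip"))

import pyspark  # should now import without setting PYTHONPATH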

EDIT:

I haven't used pyspark myself, but perhaps this helps: importing pyspark in python shell
