What path do I use for pyspark?

Submitted by 不羁的心 on 2020-01-05 15:45:17

Question


I have spark installed. And, I can go into the bin folder within my spark version, and run ./spark-shell and it runs correctly.

But, for some reason, I am unable to launch pyspark and any of the submodules.

So, I go into bin and launch ./pyspark and it tells me that my path is incorrect.

The path I currently have in PYSPARK_PYTHON is the same directory I'm running the pyspark executable script from.

What is the correct path for PYSPARK_PYTHON? Shouldn't it be the path that leads to the executable script called pyspark in the bin folder of the spark version?

That's the path I have now, but it gives me the error env: <full PYSPARK_PYTHON path>: No such file or directory. Thanks.


Answer 1:


What is the correct path for PYSPARK_PYTHON? Shouldn't it be the path that leads to the executable script called pyspark in the bin folder of the spark version?

No, it shouldn't. It should point to the Python executable you want to use with Spark (for example, the output of which python). If you don't want to use a custom interpreter, just leave it unset: Spark will use the first Python interpreter available on your system PATH.
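A minimal sketch of what that looks like in practice (the interpreter path shown is just an example; substitute whatever which python3 prints on your machine):

```shell
# Point Spark at an actual Python interpreter, not at the pyspark script.
# Use the output of `which python3` (or `which python`) on your system.
export PYSPARK_PYTHON="$(which python3)"

# Sanity-check that the variable points at a real executable.
test -x "$PYSPARK_PYTHON" && echo "PYSPARK_PYTHON is valid: $PYSPARK_PYTHON"

# Then launch pyspark from the Spark bin directory:
# ./pyspark
```

If PYSPARK_PYTHON instead points at the pyspark launcher script, the shell wrapper tries to exec it as the Python interpreter, which is what produces the env: ... no such file or directory style of failure.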



Source: https://stackoverflow.com/questions/33533786/what-path-do-i-use-for-pyspark
