Question
I have Spark installed. I can go into the bin folder within my Spark version and run ./spark-shell, and it runs correctly. But for some reason, I am unable to launch pyspark or any of its submodules. When I go into bin and launch ./pyspark, it tells me that my path is incorrect.

The current path I have for PYSPARK_PYTHON is the same as where I'm running the pyspark executable script from. What is the correct path for PYSPARK_PYTHON? Shouldn't it be the path that leads to the executable script called pyspark in the bin folder of the Spark version? That's the path I have now, but it tells me env: <full PYSPARK_PYTHON path> no such file or directory. Thanks.
Answer 1:
What is the correct path for PYSPARK_PYTHON? Shouldn't it be the path that leads to the executable script called pyspark in the bin folder of the spark version?
No, it shouldn't. It should point to a Python executable you want to use with Spark (for example, the output of which python). If you don't want to use a custom interpreter, just ignore it; Spark will use the first Python interpreter available on your system PATH.
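
As a minimal sketch, assuming your Python interpreter lives at /usr/bin/python (substitute whatever which python prints on your machine) and that you run the launcher from the root of your Spark installation:

    # Point PYSPARK_PYTHON at a Python *interpreter*, not at the pyspark script.
    # /usr/bin/python is an assumed location; use the output of `which python`.
    export PYSPARK_PYTHON=/usr/bin/python

    # Launch pyspark; it execs the interpreter named in PYSPARK_PYTHON.
    ./bin/pyspark

That also explains the error you saw: the launcher hands the value of PYSPARK_PYTHON to env to execute, and since you had pointed it at the pyspark script's own path rather than at a real Python binary, env reported "no such file or directory" for an interpreter it couldn't run as Python.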
Source: https://stackoverflow.com/questions/33533786/what-path-do-i-use-for-pyspark