Question
I have installed the Spark release spark-2.2.0-bin-hadoop2.7.
I'm using Windows 10, and my Java version is 1.8.0_144.
I have set my environment variables:
SPARK_HOME = D:\spark-2.2.0-bin-hadoop2.7
HADOOP_HOME = D:\Hadoop (where I put bin\winutils.exe)
PYSPARK_DRIVER_PYTHON = ipython
PYSPARK_DRIVER_PYTHON_OPTS = notebook
and I added D:\spark-2.2.0-bin-hadoop2.7\bin to Path.
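A quick way to confirm these variables are actually visible to a new process is to read them back from Python; a minimal sketch (it only queries the names listed above):

import os

# Print each variable the setup relies on. None means the variable
# is not visible to this process (e.g. the console was opened before
# the change, or it was set for a different user).
for name in ("SPARK_HOME", "HADOOP_HOME",
             "PYSPARK_DRIVER_PYTHON", "PYSPARK_DRIVER_PYTHON_OPTS"):
    print(name, "=", os.environ.get(name))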
When I launch pyspark from the command line, I get this error:
ipython is not recognized as an internal or external command
I also tried setting PYSPARK_DRIVER_PYTHON to jupyter, but it gives me the same error (not recognized as an internal or external command).
Any help please?
Answer 1:
Search your machine for the ipython executable; in my case it is in "c:\Anaconda3\Scripts". Then just add that directory to the PATH environment variable.
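To check whether the fix worked, ipython should now resolve from a fresh console; a minimal sketch using only the standard library:

import shutil

# shutil.which performs the same lookup cmd.exe does when resolving
# a command; None means ipython is still not on PATH for this process.
print(shutil.which("ipython"))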
Answer 2:
On Windows 10 with Anaconda installed, use the Anaconda Prompt rather than the Windows cmd, and launch the Jupyter notebook with the command below:
pyspark --master local[2]
Please make sure all the configuration mentioned in the question is done.
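Once the notebook opens, a short smoke test confirms the driver actually started; a minimal sketch, assuming the sc SparkContext that the pyspark launcher creates in the session:

# Run in a notebook cell started via pyspark.
# sc is the SparkContext that pyspark injects into the session.
print(sc.version)                          # expected: 2.2.0
print(sc.parallelize(range(100)).count())  # expected: 100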
Source: https://stackoverflow.com/questions/47364535/ipython-is-not-recognized-as-an-internal-or-external-command-pyspark