ipython is not recognized as an internal or external command (pyspark)

Posted by 删除回忆录丶 on 2020-12-26 07:44:47

Question


I have installed the Spark release spark-2.2.0-bin-hadoop2.7.

I'm using Windows 10.

My Java version is 1.8.0_144.

I have set my environment variables:

SPARK_HOME D:\spark-2.2.0-bin-hadoop2.7

HADOOP_HOME D:\Hadoop (where I put bin\winutils.exe)

PYSPARK_DRIVER_PYTHON ipython

PYSPARK_DRIVER_PYTHON_OPTS notebook

Path includes D:\spark-2.2.0-bin-hadoop2.7\bin
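For reference, the variables above can be set from a Windows command prompt with `setx` (a sketch, assuming the paths given in the question; `setx` persists values for the user, and a new cmd window is needed for them to take effect):

```shell
:: Persist the variables for the current user (open a new cmd window afterwards)
setx SPARK_HOME "D:\spark-2.2.0-bin-hadoop2.7"
setx HADOOP_HOME "D:\Hadoop"
setx PYSPARK_DRIVER_PYTHON "ipython"
setx PYSPARK_DRIVER_PYTHON_OPTS "notebook"
:: Append Spark's bin directory to the user PATH
setx PATH "%PATH%;D:\spark-2.2.0-bin-hadoop2.7\bin"
```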

When I launch pyspark from the command line, I get this error:

  'ipython' is not recognized as an internal or external command

I also tried setting PYSPARK_DRIVER_PYTHON to jupyter, but it gives the same error (not recognized as an internal or external command).

Any help, please?


Answer 1:


Search your machine for the ipython application; in my case it is in "c:\Anaconda3\Scripts". Then add that path to the PATH environment variable.
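The fix above can be sketched in a cmd session as follows (the Anaconda path is the answer's example and may differ on your machine; `set` only affects the current session, so use `setx` or the System Properties dialog to make it permanent):

```shell
:: Add the directory containing ipython.exe to PATH for this session
set PATH=%PATH%;C:\Anaconda3\Scripts
:: Verify that ipython now resolves
where ipython
:: Then launch pyspark, which will start the ipython-backed notebook driver
pyspark
```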




Answer 2:


On Windows 10 with Anaconda installed, use the Anaconda Prompt rather than the Windows cmd shell, and launch the Jupyter notebook with the command below:

  pyspark --master local[2]

Please make sure all the configurations mentioned in the question are in place.
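Before launching from the Anaconda Prompt, the relevant settings can be checked like this (a sketch; the variable names are the ones from the question, which PySpark's launch scripts read to pick the driver):

```shell
:: Confirm the variables PySpark's launch scripts consume
echo %SPARK_HOME%
echo %PYSPARK_DRIVER_PYTHON%
echo %PYSPARK_DRIVER_PYTHON_OPTS%
:: ipython ships with Anaconda and should resolve inside the Anaconda Prompt
where ipython
:: Launch with two local worker threads
pyspark --master local[2]
```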



Source: https://stackoverflow.com/questions/47364535/ipython-is-not-recognized-as-an-internal-or-external-command-pyspark
