Unable to run pyspark

被撕碎了的回忆 2020-12-13 10:32

I installed Spark on Windows, and I'm unable to start pyspark. When I type in c:\Spark\bin\pyspark, I get the following error:

5 Answers
  •  予麋鹿 (OP)
     2020-12-13 11:27

    Spark 2.1.0 doesn't support Python 3.6.0. To fix this, change the Python version in your Anaconda environment by running the following commands:

    conda create -n py35 python=3.5 anaconda
    activate py35
    
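As a quick sanity check before launching pyspark, you can confirm the interpreter version from Python itself. A minimal sketch (the helper name is mine, and it assumes Spark 2.1.0's documented support for Python 2.7 and 3.4+, with 3.6 support arriving only in later releases):

```python
import sys

def spark_210_compatible(version_info=sys.version_info):
    """Return True if this Python version is usable with Spark 2.1.0."""
    major, minor = version_info[0], version_info[1]
    if major == 2:
        return minor == 7          # Python 2.7 line
    if major == 3:
        return minor in (4, 5)     # Python 3.4 / 3.5; 3.6 is not supported
    return False

print(spark_210_compatible())
```

Run this with the interpreter that pyspark will pick up (e.g. inside the new py35 environment); if it prints False, pyspark is likely to fail at startup.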
