I installed Spark on Windows, and I'm unable to start pyspark. When I type in c:\Spark\bin\pyspark, I get the following error:
Spark 2.1.0 doesn't support Python 3.6.0. To solve this, change the Python version in your Anaconda environment. Run the following commands in your Anaconda prompt:
conda create -n py35 python=3.5 anaconda
activate py35
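As a quick sanity check before launching pyspark from the new environment, you can verify the interpreter version from Python itself. The helper below is only an illustration (the function name and the exact supported-version list are assumptions based on Spark 2.1.0 advertising Python 2.6+/3.4+ support, with 3.6 not yet working):

```python
import sys

def spark_210_supports(version_info):
    """Return True if the given (major, minor) tuple is a Python
    version Spark 2.1.0 can run under (roughly 2.6/2.7 or 3.4/3.5)."""
    major, minor = version_info[:2]
    if major == 2:
        return minor in (6, 7)
    if major == 3:
        return minor in (4, 5)
    return False

# Python 3.6 is rejected, which is why pyspark fails to start,
# while the py35 environment created above passes the check.
print(spark_210_supports((3, 6)))  # False
print(spark_210_supports((3, 5)))  # True
print(spark_210_supports(sys.version_info))
```

Running this inside the activated `py35` environment should print `True` for the last line; in a Python 3.6 environment it prints `False`.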