I installed Spark on Windows, and I'm unable to start pyspark. When I type in c:\Spark\bin\pyspark, I get the following error:
I resolved this issue with one change to a Python script. I placed the piece of code below in the script named serializers.py, located at c:\your-installation-dir\spark-2.0.2-bin-hadoop-2.7\python\pyspark\, replacing the line at line number 381:
cls = _old_namedtuple(*args, **kwargs, verbose=False, rename=False, module=None)
Then run pyspark from your command line and it will work.
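
For anyone wondering why this works: Spark 2.0.2's serializers.py copies collections.namedtuple with types.FunctionType so it can monkey-patch it, and that copy does not carry over __kwdefaults__. Python 3.6 made namedtuple's verbose and rename arguments keyword-only (and added module), so the copied function suddenly requires all three, which is what crashes pyspark on startup. The sketch below is my rough reconstruction of the relevant block; names like _copy_func and _old_namedtuple are from memory, I've dropped Spark's extra picklability wrapping, and it targets Python 3.6 only, since Python 3.7 removed the verbose parameter:

    import collections
    import types

    def _copy_func(f):
        # Spark 2.0.2 copies namedtuple like this before patching it.
        # types.FunctionType(code, globals, name, argdefs, closure) preserves
        # __defaults__ but NOT __kwdefaults__ -- and on Python 3.6,
        # namedtuple's verbose/rename/module are keyword-only, so the copy
        # loses their defaults and calling it raises:
        #   TypeError: namedtuple() missing 3 required keyword-only
        #   arguments: 'verbose', 'rename', and 'module'
        return types.FunctionType(f.__code__, f.__globals__, f.__name__,
                                  f.__defaults__, f.__closure__)

    _old_namedtuple = _copy_func(collections.namedtuple)

    def namedtuple(*args, **kwargs):
        # Original line 381: cls = _old_namedtuple(*args, **kwargs)
        # The fix supplies the lost keyword-only defaults explicitly:
        cls = _old_namedtuple(*args, **kwargs,
                              verbose=False, rename=False, module=None)
        return cls  # Spark additionally wraps cls to keep it picklable

    if __name__ == "__main__":
        Point = namedtuple("Point", ["x", "y"])  # works again on Python 3.6
        print(Point(x=1, y=2))

Note that this edit hard-codes the defaults, so it would break any caller that passes verbose or rename itself; upgrading to a Spark release that supports your Python version is the cleaner long-term fix.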