Unable to run pyspark

被撕碎了的回忆 2020-12-13 10:32

I installed Spark on Windows, and I'm unable to start pyspark. When I type in c:\Spark\bin\pyspark, I get the following error:

5 Answers
  •  既然无缘
    2020-12-13 11:22

    I resolved this issue with one change to a Python script.

    I placed the piece of code below in the Python script named serializers.py, located in c:\your-installation-dir\spark-2.0.2-bin-hadoop-2.7\python\pyspark\. The line at line number 381 should be replaced with:

    cls = _old_namedtuple(*args, **kwargs, verbose=False, rename=False, module=None)
    

    Then run pyspark from your command line and it will work.
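    For context, the reason this edit helps (as I understand it) is that Spark 2.0.x's serializers.py copies collections.namedtuple without carrying over its keyword-only defaults, so under Python 3.6 the copy demands verbose and rename explicitly. Below is a minimal sketch, not the actual Spark source, reproducing that failure mode and showing why supplying the keyword-only arguments, as in the edited line 381, works around it. The helper name _copy_func and the variable _old_namedtuple are assumptions modeled on the Spark code; this sketch also assumes Python 3.6, where namedtuple still accepts verbose.

    import collections
    import types

    def _copy_func(f):
        # Rebuild the function from its code object, similar in spirit to Spark's
        # helper. Note that __kwdefaults__ is NOT copied, so keyword-only
        # parameters lose their default values in the copy.
        return types.FunctionType(f.__code__, f.__globals__, f.__name__,
                                  f.__defaults__, f.__closure__)

    _old_namedtuple = _copy_func(collections.namedtuple)

    # Under Python 3.6 the plain call fails, because the copied function now
    # requires the keyword-only arguments that lost their defaults:
    #   _old_namedtuple("Point", ["x", "y"])
    #   TypeError: namedtuple() missing 2 required keyword-only arguments:
    #   'verbose' and 'rename'

    # Supplying them explicitly, as in the edited line 381, avoids the error:
    Point = _old_namedtuple("Point", ["x", "y"],
                            verbose=False, rename=False, module=None)
    print(Point(1, 2))  # Point(x=1, y=2)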
