PySpark error "does not exist in the JVM" when initializing SparkContext

一向 2021-01-07 22:32

I am using Spark on EMR and writing a PySpark script. I am getting an error when I try to run:

from pyspark import SparkContext
sc = SparkContext()
10 Answers
  •  無奈伤痛
    2021-01-07 22:46

PySpark recently released 2.4.0, but there is no stable Spark release that coincides with this new version yet. Try downgrading to PySpark 2.3.2; this fixed it for me.

Edit: to be clearer, your PySpark version needs to match the Apache Spark version that is installed, or you may run into compatibility issues.
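
    For instance, a minimal sketch of the downgrade, assuming pip manages the Python environment on the node where the script runs and spark-submit is on the PATH (2.3.2 is just the version cited here; use whatever version the cluster actually reports):

        # Show the Spark version shipped with the cluster
        spark-submit --version

        # Pin PySpark to that same version
        pip install pyspark==2.3.2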

    Check the installed version of PySpark with:

    pip freeze
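
    For example, on a Unix-like shell you can filter the output down to just the PySpark entry (assuming grep is available):

        pip freeze | grep -i pyspark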
