Spark 2.1 - Error While instantiating HiveSessionState

借酒劲吻你 2020-12-06 06:38

With a fresh install of Spark 2.1, I am getting an error when executing the pyspark command.

Traceback (most recent call last):
File \"/usr/local/spark/pytho         


        
10 Answers
  •  时光说笑
    2020-12-06 07:13

    I was getting the same error in a Windows environment, and the trick below worked for me.

    In shell.py the Spark session is defined with .enableHiveSupport():

     spark = SparkSession.builder\
                .enableHiveSupport()\
                .getOrCreate()
    

    Remove Hive support and redefine the Spark session as below:

    spark = SparkSession.builder\
            .getOrCreate()
    

    You can find shell.py in your Spark installation folder. For me it's in "C:\spark-2.1.1-bin-hadoop2.7\python\pyspark".
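
    Alternatively, if you'd rather not edit shell.py, here is a minimal sketch of the
    same idea applied in your own code: build the session without Hive support and
    request the in-memory catalog. The spark.sql.catalogImplementation key below is
    my assumption about the relevant setting, so adjust it if your setup differs:

        from pyspark.sql import SparkSession

        # Sketch: request the in-memory catalog instead of Hive so that
        # HiveSessionState is never instantiated (assumed config key).
        spark = SparkSession.builder \
            .config("spark.sql.catalogImplementation", "in-memory") \
            .getOrCreate()

        # Quick sanity check that the session works without the Hive catalog.
        spark.range(5).show()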

    Hope this helps
