Spark 2.1 - Error While instantiating HiveSessionState

借酒劲吻你 2020-12-06 06:38

With a fresh install of Spark 2.1, I am getting an error when executing the pyspark command.

Traceback (most recent call last):
  File "/usr/local/spark/pytho
10 replies
  •  失恋的感觉
    2020-12-06 07:27

    For me, the issue was solved by unsetting the HADOOP_CONF_DIR environment variable. It pointed to my Hadoop configuration directory, so when the pyspark shell started, the variable caused Spark to try to connect to a Hadoop cluster that wasn't running.

    So if you have the HADOOP_CONF_DIR variable set, you either need to start the Hadoop cluster before using the Spark shells,

    or you need to unset the variable.
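    A minimal sketch of the second option: unset the variable for the current shell session before launching pyspark. (Where HADOOP_CONF_DIR was originally exported, e.g. `~/.bashrc` or `spark-env.sh`, depends on your setup and is an assumption here.)

    ```shell
    # Remove HADOOP_CONF_DIR from the current shell environment so Spark
    # does not try to pick up a Hadoop cluster configuration.
    unset HADOOP_CONF_DIR

    # Confirm it is no longer set:
    if [ -z "${HADOOP_CONF_DIR:-}" ]; then
      echo "HADOOP_CONF_DIR is unset"
    fi

    # Now start the shell:
    # pyspark
    ```

    Note that `unset` only affects the current session; to make the change permanent, remove the corresponding `export` line from your shell profile or Spark environment script.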
