Spark fails on Windows: <console>:16: error: not found: value sqlContext

Submitted by 孤人 on 2019-12-01 08:03:47

I had exactly the same issue and went through a number of possible solutions explained in the links you posted, but nothing worked at the time. Running the spark-shell command creates the tmp\hive directory on C:, and I eventually found out that there was a problem with its permissions. I made sure that HADOOP_HOME was correctly set and contained \bin\winutils.exe, then simply moved tmp\hive under %HADOOP_HOME%\bin and restarted the command prompt. This finally resolved the issue, but remember to run cmd as an administrator. Hope this helps.

I was facing the same issue. After investigating, I observed that there was a compatibility issue between the Spark version and the hadoop-2.x.x winutils.exe.

Based on my experiments, I suggest using the hadoop-2.7.1 winutils.exe with spark-2.2.0-bin-hadoop2.7 and the hadoop-2.6.0 winutils.exe with spark-1.6.0-bin-hadoop2.6, and setting the environment variables below (a quick verification sketch follows the list):

SCALA_HOME  : C:\Program Files (x86)\scala2.11.7;
JAVA_HOME   : C:\Program Files\Java\jdk1.8.0_51
HADOOP_HOME : C:\Hadoop\winutils-master\hadoop-2.7.1
SPARK_HOME  : C:\Hadoop\spark-2.2.0-bin-hadoop2.7
PATH    : %JAVA_HOME%\bin;%SCALA_HOME%\bin;%HADOOP_HOME%\bin;%SPARK_HOME%\bin;
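If you want to confirm that the shell actually picks these up, here is a minimal Scala sketch you can paste into spark-shell once it starts; the variable names simply mirror the list above, so adjust them if yours differ:

// Sanity check: print the environment variables Spark and Hadoop rely on.
// Runs in spark-shell or any Scala REPL; prints "<not set>" for missing ones.
Seq("JAVA_HOME", "SCALA_HOME", "HADOOP_HOME", "SPARK_HOME").foreach { name =>
  println(f"$name%-12s -> ${sys.env.getOrElse(name, "<not set>")}")
}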

Create the C:\tmp\hive directory and grant access permissions using the command below:

C:\Hadoop\winutils-master\hadoop-2.7.1\bin>winutils.exe chmod -R 777 C:\tmp\hive

Remove the metastore_db directory from the path below, if it exists:

C:\Users\<User_Name>\metastore_db

Use the command below to start the Spark shell:

C:\>spark-shell
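Once spark-shell starts without errors, a quick sanity check that the SQL entry points are available (a minimal sketch; spark is the SparkSession pre-defined by the 2.x shell, sqlContext is the one pre-defined by the 1.6 shell):

// Paste into spark-shell after it starts cleanly.
spark.sql("SELECT 1 AS ok").show()           // Spark 2.x (SparkSession)
// sqlContext.sql("SELECT 1 AS ok").show()   // Spark 1.6.x equivalent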

I too had a similar issue (import spark.implicits._ not found) on Windows 10.
As suggested in the posts above, I did the following (a quick check that the import resolves is shown after this list):
1. Set HADOOP_HOME (%HADOOP_HOME%/bin/winutils.exe should exist).
2. Ran %HADOOP_HOME%/bin/winutils.exe chmod -R 777 F:\tmp\hive (spark-2.1.1 and the hadoop-2.7.1 winutils were on the same F: drive in my case).
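For completeness, a minimal Spark 2.x sketch to verify the implicits now resolve; paste it into spark-shell, where spark is the SparkSession the shell pre-defines:

// If tmp\hive permissions are fixed, this import and conversion should succeed.
import spark.implicits._
Seq((1, "a"), (2, "b")).toDF("id", "label").show()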
