How to fix an exception when running a Spark SQL program locally on Windows 10 with Hive support enabled?

Submitted by 谁说我不能喝 on 2019-12-20 07:26:03

Question


I am working with Spark SQL 2.3.1 and I am trying to enable Hive support while creating a session, as below:

.enableHiveSupport()
.config("spark.sql.warehouse.dir", "c://tmp//hive")

I ran the command below:

C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe chmod 777  C:\tmp\hive
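For reference, the usual Windows-side setup is to point `HADOOP_HOME` at the winutils installation and apply the permissions recursively rather than only on the top-level directory. The sequence below is a sketch for `cmd.exe`; the install path is the one from the question and should be adjusted to your own layout:

```shell
:: Environment setup (cmd.exe); adjust the path to your Hadoop install
set HADOOP_HOME=C:\Software\hadoop\hadoop-2.7.1
set PATH=%PATH%;%HADOOP_HOME%\bin

:: Create the scratch dir and grant permissions recursively (-R),
:: so subdirectories Hive creates are covered too
mkdir C:\tmp\hive
winutils.exe chmod -R 777 C:\tmp\hive

:: Verify the resulting permissions
winutils.exe ls C:\tmp\hive
```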

While running my program I get:

Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)

How can I fix this issue and run this on my local Windows machine?


Answer 1:


Try to use this command:

hadoop fs -chmod -R 777 /tmp/hive/

This is a Spark/Hive exception, not a Windows error. You need to set the correct permissions on the HDFS scratch directory, not only on your local directory.



Source: https://stackoverflow.com/questions/53100404/how-to-fix-exception-while-running-locally-spark-sql-program-on-windows10-by-ena
