How to fix an exception when running a spark-sql program locally on Windows 10 with Hive support enabled?
I am working with Spark SQL 2.3.1 and I am trying to enable Hive support while creating a session, as below:

```
.enableHiveSupport()
.config("spark.sql.warehouse.dir", "c://tmp//hive")
```

I ran the following command:

```
C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe chmod 777 C:\tmp\hive
```

But while running my program I get:

```
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive
```
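For context, here is a minimal sketch of the full session setup the snippet above comes from, assuming a standalone Scala app run in local mode with `HADOOP_HOME` pointing at the `winutils.exe` installation; the object name and app name are hypothetical:

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: a local Spark SQL 2.3.1 session with Hive support on Windows.
// Assumes HADOOP_HOME is set to C:\Software\hadoop\hadoop-2.7.1 so winutils.exe
// can be found; HiveSupportExample and the appName are hypothetical.
object HiveSupportExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-support-example") // hypothetical app name
      .master("local[*]")
      .config("spark.sql.warehouse.dir", "c://tmp//hive")
      .enableHiveSupport()
      .getOrCreate()

    // Any Hive-backed statement forces creation of the scratch dir,
    // which is the point where the permission error above is raised.
    spark.sql("SHOW DATABASES").show()

    spark.stop()
  }
}
```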