The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- (on Windows)

Backend · Unresolved · 16 answers · 1189 views
星月不相逢 · 2020-11-28 04:45

I am running Spark on Windows 7. When I use Hive, I see the following error:

The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
16 answers
  •  栀梦
     栀梦 (original poster)
     2020-11-28 05:13

    This is a simple 4-step process.

    For Spark 2.0+:

    1. Download Hadoop for Windows / winutils
    2. Add this to your code (before SparkSession initialization):

      if (System.getProperty("os.name").toLowerCase().contains("windows")) {
          System.setProperty("hadoop.home.dir", "C:/Users//winutils-master/hadoop-2.7.1");
      }
      
    3. Add this to your SparkSession builder (you can use C:/Temp instead of Desktop):

      .config("hive.exec.scratchdir","C:/Users//Desktop/tmphive")
      
    4. Open cmd.exe and run (this grants the scratch dir the permissions Hive expects):

      "path\to\hadoop-2.7.1\bin\winutils.exe" chmod 777 C:\Users\\Desktop\tmphive
      
