The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- (on Windows)

星月不相逢  2020-11-28 04:45

I am running Spark on Windows 7. When I use Hive, I see the following error:

The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
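For context, a session roughly like the following triggers the check. This is only a minimal sketch in PySpark (the asker's exact setup is not shown); the error surfaces because enabling Hive support makes Spark initialize the Hive scratch directory when the first Hive statement runs:

    # Minimal sketch (assumption: the session is created roughly like this).
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive-on-windows")
             .enableHiveSupport()       # Hive support is what needs /tmp/hive
             .getOrCreate())

    spark.sql("SHOW DATABASES").show()  # first Hive call surfaces the error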


        
16 Answers
  •  余生分开走    2020-11-28 05:19

    I was getting the same error "The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-" on Windows 7. Here is what I did to fix the issue:

    1. I had installed Spark under C:\Program Files (x86)..., so it was looking for /tmp/hive on the C: drive, i.e., C:\tmp\hive
    2. I downloaded WinUtils.exe from https://github.com/steveloughran/winutils. I chose the version matching the Hadoop package I had selected when installing Spark, i.e., hadoop-2.7.1. (You can find it under the bin folder, i.e., https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1/bin)
    3. I then used the following command to make the C:\tmp\hive folder writable: winutils.exe chmod 777 \tmp\hive (a scripted version of these steps is sketched after the note below)

    Note: With an earlier version of winutils, the chmod command also set the requested permission without error, but Spark still complained that the /tmp/hive folder was not writable.
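    The steps above can also be scripted before the Spark session is created. The sketch below assumes winutils.exe lives in C:\hadoop\bin and that the scratch dir is C:\tmp\hive; both paths are assumptions, so adjust them to your own layout:

        # Sketch of automating the fix (assumed layout: winutils.exe in C:\hadoop\bin).
        import os
        import subprocess

        os.environ["HADOOP_HOME"] = r"C:\hadoop"    # Spark looks for %HADOOP_HOME%\bin\winutils.exe
        os.makedirs(r"C:\tmp\hive", exist_ok=True)  # make sure the scratch dir exists
        subprocess.run(
            [r"C:\hadoop\bin\winutils.exe", "chmod", "777", r"\tmp\hive"],
            check=True,                             # raise if winutils reports a failure
        )

    After this runs, creating the Hive-enabled SparkSession in the same process should no longer hit the permissions error.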
