Writing to HDFS could only be replicated to 0 nodes instead of minReplication (=1)

谎友^ 2020-12-08 04:32

I have 3 data nodes running, but while running a job I get the error shown below:

java.io.IOException: File /user/ashsshar/olhcache/load
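
This error generally means the NameNode could not find any live DataNode willing to accept the block. A quick way to check whether the DataNodes are actually running and registered is sketched below (these are standard JDK/Hadoop commands, not part of the original report):

    :: Check that a DataNode JVM process is running on each data node
    jps
    :: Report live/dead DataNodes and the remaining capacity on each
    hdfs dfsadmin -report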

7 Answers
  •  盖世英雄少女心
    2020-12-08 05:23

    A very simple fix for the same issue on Windows 8.1.
    I was using Windows 8.1 and Hadoop 2.7.2, and did the following to resolve it (a rough command sequence is sketched after this list).

    1. When I ran hdfs namenode -format, I noticed there was a lock on my data directory.
    2. I deleted that data folder completely and then ran hdfs namenode -format again.
    3. After performing the above two steps, I could successfully place my files in HDFS. I used the start-all.cmd command to start YARN and the NameNode.
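
    The sketch below spells out that sequence, assuming a typical Windows setup where HDFS keeps its data under C:\hadoop\data; the actual directories come from dfs.namenode.name.dir and dfs.datanode.data.dir in your hdfs-site.xml, so substitute your own paths. Note that re-formatting the NameNode wipes all existing HDFS metadata:

        :: Stop any running Hadoop daemons first
        stop-all.cmd

        :: Remove the locked NameNode/DataNode data directories
        :: (example paths; use the values from your hdfs-site.xml)
        rmdir /s /q C:\hadoop\data\namenode
        rmdir /s /q C:\hadoop\data\datanode

        :: Re-format the NameNode; this erases all HDFS metadata
        hdfs namenode -format

        :: Start HDFS and YARN again
        start-all.cmd

        :: Verify that writes now succeed (somefile.txt is just a placeholder local file)
        hdfs dfs -mkdir -p /user/ashsshar/olhcache
        hdfs dfs -put somefile.txt /user/ashsshar/olhcache/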
