DataNode failing to start in Hadoop


Question


I am trying to set up a Hadoop installation on Ubuntu 11.04 with Sun Java 6, using the Hadoop 0.20.203 rc1 build. I repeatedly run into the same issue: when I try to start Hadoop, the DataNode fails to start with a "Cannot lock storage" error.

2011-12-22 22:09:20,874 INFO org.apache.hadoop.hdfs.server.common.Storage: Cannot lock storage /home/hadoop/work/dfs_blk/hadoop. The directory is already locked.
2011-12-22 22:09:20,896 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Cannot lock storage /home/hadoop/work/dfs_blk/hadoop. The directory is already locked.
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.lock(Storage.java:602)
        at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:455)
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:111)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:354)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:268)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1480)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1419)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1437)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1563)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1573)

I have tried upgrading and downgrading across a couple of versions in the 0.20 branch from Apache, and even Cloudera's distribution, as well as deleting and reinstalling Hadoop, but I still run into this issue. Typical workarounds, such as deleting the *.pid files in the /tmp directory, are not working either. Could anybody point me to a solution?


Answer 1:


Yes, I formatted the NameNode. The problem was in one of the rogue hdfs-site.xml templates that I copy-pasted: dfs.data.dir and dfs.name.dir pointed to the same directory location, which resulted in the locked-storage error. They must be different directories. Unfortunately, the Hadoop documentation is not clear enough on this subtle detail.
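
For reference, a minimal hdfs-site.xml sketch of the fix is shown below. The /home/hadoop/work/dfs_name path is an illustrative placeholder for the NameNode directory; /home/hadoop/work/dfs_blk/hadoop matches the data directory from the log above. (dfs.name.dir and dfs.data.dir are the property names used by this 0.20-era version of Hadoop.)

<configuration>
  <!-- Where the NameNode stores filesystem metadata -->
  <property>
    <name>dfs.name.dir</name>
    <value>/home/hadoop/work/dfs_name</value>
  </property>
  <!-- Where the DataNode stores blocks; must NOT be the same directory as dfs.name.dir -->
  <property>
    <name>dfs.data.dir</name>
    <value>/home/hadoop/work/dfs_blk/hadoop</value>
  </property>
</configuration>

After separating the directories, reformat the NameNode (bin/hadoop namenode -format on the 0.20 branch) before restarting HDFS, as the answer describes; note that formatting wipes any existing HDFS metadata.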



Source: https://stackoverflow.com/questions/8613112/datanode-failing-to-start-in-hadoop
