Question
I installed Hadoop 3.1.1 via Homebrew on macOS.
My core-site.xml is configured as follows:
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
And hdfs-site.xml as follows:
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/names</value>
</property>
<property>
<name>fs.checkpoint.dir</name>
<value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/checkpoint</value>
</property>
<property>
<name>fs.checkpoint.edits.dir</name>
<value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/checkpoint</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/data</value>
</property>
</configuration>
When I run start-dfs.sh, the NameNode cannot start. The log shows the following exception:
2018-09-26 09:49:47,576 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
2018-09-26 09:49:47,583 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
java.io.IOException: Could not parse line: Filesystem 1024-blocks Used Available Capacity Mounted on
at org.apache.hadoop.fs.DF.parseOutput(DF.java:195)
at org.apache.hadoop.fs.DF.getFilesystem(DF.java:76)
at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker$CheckedVolume.<init>(NameNodeResourceChecker.java:69)
at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker.addDirToCheck(NameNodeResourceChecker.java:165)
at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker.<init>(NameNodeResourceChecker.java:134)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startCommonServices(FSNamesystem.java:1155)
at org.apache.hadoop.hdfs.server.namenode.NameNode.startCommonServices(NameNode.java:788)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:714)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:937)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:910)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1643)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1710)
I have searched a lot and cannot find any useful info.
Answer 1:
Running 3.1.1 also from Homebrew... (I'm also using Java 10, but I would suggest setting JAVA_HOME in hadoop-env.sh to Java 8 otherwise.)

If I:

1. From the question, replace /Users/yishuihanxiao/Personal_Home/ws/DB_Data with /tmp, so that my files are under /tmp/hadoop/hdfs
2. Again from the question, remove fs.default.name from hdfs-site.xml, because it shouldn't be a property there
3. Individually run hdfs namenode -format, then start with hdfs namenode

then the namenode starts. I can access the NameNode UI, but there are no datanodes.

4. Open a separate terminal window and run hdfs datanode.

Then that starts okay and joins the namenode process without error (shown in the logs and UI), and I can go to the web UIs for both the namenode and the datanode, and do other Hadoop tasks such as starting up YARN.
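To make the second step concrete: a minimal sketch of what the question's hdfs-site.xml could look like after dropping fs.default.name (which belongs in core-site.xml, where the current property name is fs.defaultFS). The /tmp-based paths follow this answer's first step; the checkpoint properties from the question are omitted here for brevity:

```xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///tmp/hadoop/hdfs/names</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///tmp/hadoop/hdfs/data</value>
  </property>
</configuration>
```

After editing the file, re-run hdfs namenode -format before starting the namenode, since the storage directories changed.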
Source: https://stackoverflow.com/questions/52509542/macos-hadoop-3-1-1-failed-to-start-namenode-java-io-ioexception-could-not-pa