macOS Hadoop 3.1.1 - Failed to start namenode. java.io.IOException: Could not parse line: "Filesystem 1024-blocks Used Available Capacity Mounted on"

Submitted by 我的梦境 on 2019-12-20 07:10:48

Question


I installed Hadoop 3.1.1 via Homebrew on macOS.

core-site.xml is configured as follows:

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/tmp</value>
        <description>A base for other temporary directories.</description>
    </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

hdfs-site.xml is as follows:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>  
      <name>dfs.namenode.name.dir</name>  
      <value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/names</value>  
  </property>
  <property>
    <name>fs.checkpoint.dir</name>
    <value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/checkpoint</value>
  </property>
  <property>
    <name>fs.checkpoint.edits.dir</name>
    <value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/checkpoint</value>
  </property>
  <property>  
      <name>dfs.datanode.data.dir</name>  
      <value>file:///Users/yishuihanxiao/Personal_Home/ws/DB_Data/hadoop/hdfs/data</value>  
  </property>
</configuration>

When I run start-dfs, the NameNode fails to start. The log shows the following exception:

2018-09-26 09:49:47,576 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
2018-09-26 09:49:47,583 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
java.io.IOException: Could not parse line: Filesystem   1024-blocks     Used Available Capacity  Mounted on
    at org.apache.hadoop.fs.DF.parseOutput(DF.java:195)
    at org.apache.hadoop.fs.DF.getFilesystem(DF.java:76)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker$CheckedVolume.<init>(NameNodeResourceChecker.java:69)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker.addDirToCheck(NameNodeResourceChecker.java:165)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker.<init>(NameNodeResourceChecker.java:134)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startCommonServices(FSNamesystem.java:1155)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.startCommonServices(NameNode.java:788)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:714)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:937)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:910)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1643)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1710)
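For context, the line the exception quotes is the header of `df` output: per the stack trace, the NameNode resource checker uses Hadoop's `DF` helper, which shells out to `df` to check free space on each configured directory. On macOS, BSD `df` labels its block-size column "1024-blocks" (GNU `df` on Linux prints "1K-blocks"), which matches the unparseable line in the log. You can inspect that header yourself:

```shell
# Print the header line that Hadoop's DF helper has to parse.
# On macOS (BSD df) the block column is labeled "1024-blocks",
# matching the line quoted in the exception; GNU df prints "1K-blocks".
df -k / | head -n 1
```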

I searched a lot but could not find any useful info.


Answer 1:


Running 3.1.1 from Homebrew as well... (I'm also on Java 10, but otherwise I would suggest setting JAVA_HOME in hadoop-env.sh to Java 8.)

If I:

  1. From the question, replace /Users/yishuihanxiao/Personal_Home/ws/DB_Data with /tmp, so that my files live under /tmp/hadoop/hdfs

  2. Again from the question, remove fs.default.name from hdfs-site.xml, because it should not be a property there

  3. Run hdfs namenode -format, then start the NameNode with hdfs namenode. The NameNode starts and I can reach its web UI, but there are no DataNodes yet.

  4. Open a separate terminal window and run hdfs datanode.

Then the DataNode starts cleanly and registers with the NameNode without error (visible in both the logs and the UI), and I can browse the web UIs for both the NameNode and the DataNode and run other Hadoop tasks, such as starting YARN.
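Putting steps 1 and 2 together, the trimmed hdfs-site.xml would look roughly like this (the /tmp paths follow step 1; the filesystem URI stays only in core-site.xml, where fs.defaultFS is the non-deprecated name for the fs.default.name setting):

```xml
<!-- Sketch of hdfs-site.xml after steps 1 and 2: fs.default.name removed,
     directories relocated under /tmp per step 1 -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///tmp/hadoop/hdfs/names</value>
  </property>
  <property>
    <name>fs.checkpoint.dir</name>
    <value>file:///tmp/hadoop/hdfs/checkpoint</value>
  </property>
  <property>
    <name>fs.checkpoint.edits.dir</name>
    <value>file:///tmp/hadoop/hdfs/checkpoint</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///tmp/hadoop/hdfs/data</value>
  </property>
</configuration>
```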



Source: https://stackoverflow.com/questions/52509542/macos-hadoop-3-1-1-failed-to-start-namenode-java-io-ioexception-could-not-pa
