Error in starting namenode in hadoop 2.4.1

长发绾君心 2021-01-07 10:12

When I try to start dfs using:

start-dfs.sh

I get an error saying:

14/07/03 11:03:21 WARN util.NativeCodeLoader: Unable to

4 Answers
  • 2021-01-07 10:23

    Edit your .bashrc file and add the following lines:

    export HADOOP_HOME=path_to_your_hadoop_folder
    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    

    And although your ssh should already be working from what you've said, set it up again just in case:

    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
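
    After reloading your shell, you can sanity-check the resolved values before rerunning start-dfs.sh. A minimal sketch, assuming a hypothetical install under /usr/local/hadoop (substitute your own path):

    ```shell
    # Hypothetical install location -- substitute your actual Hadoop folder
    export HADOOP_HOME=/usr/local/hadoop
    export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

    # java.library.path should now resolve inside the install tree
    echo "$HADOOP_OPTS"
    ```

    If Hadoop is on your PATH, `hadoop checknative -a` will then report whether the native library actually loads.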
    
  • 2021-01-07 10:25

    Stop the JVM from printing the stack guard warning to stdout/stderr, because this is what breaks the HDFS start script.


    Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:

    export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"
    

    with:

    export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"
    


    (This solution has been found on Sumit Chawla's blog)

  • 2021-01-07 10:27

    It seems like you haven't added the $HADOOP_INSTALL line, which points to your main hadoop folder, to your .profile file. As Balduz suggests, HADOOP_HOME will work in place of the $HADOOP_INSTALL variable. I would use his suggestion, but you can also fix it by adding...

    export HADOOP_INSTALL=/path/to/hadoop/
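
    A quick way to confirm the variable points where you expect (a sketch; /usr/local/hadoop is a hypothetical path, substitute your own):

    ```shell
    # Hypothetical path -- point this at your unpacked Hadoop directory
    export HADOOP_INSTALL=/usr/local/hadoop

    # In Hadoop 2.x, start-dfs.sh lives under sbin; print where we expect it,
    # then check with `ls` that the file really exists there
    echo "$HADOOP_INSTALL/sbin/start-dfs.sh"
    ```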
    
  • 2021-01-07 10:41

    Please check your HADOOP_CONF_DIR (most likely set in .bashrc). It should point to $HADOOP_HOME/etc/hadoop.
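
    The Hadoop 2.x convention can be sketched like this (assuming a hypothetical HADOOP_HOME of /usr/local/hadoop):

    ```shell
    # Hypothetical install root -- adjust to your setup
    export HADOOP_HOME=/usr/local/hadoop
    # In Hadoop 2.x the config dir (hadoop-env.sh, core-site.xml,
    # hdfs-site.xml) sits under etc/hadoop inside the install
    export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"

    echo "$HADOOP_CONF_DIR"
    ```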
