Error starting namenode in Hadoop 2.4.1


Stop the JVM from printing the stack guard warning to stdout/stderr, because this is what breaks the HDFS startup script.


Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"


(This solution was found on Sumit Chawla's blog.)
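
A quick way to confirm the fix, assuming a standard single-node setup started from the Hadoop install directory: restart HDFS and check that the NameNode stays up.

sbin/stop-dfs.sh
sbin/start-dfs.sh    # should no longer print the stack guard warning
jps                  # the NameNode process should now appear in the list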

Edit your .bashrc file and add the following lines:

export HADOOP_HOME=path_to_your_hadoop_folder
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
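
After editing .bashrc, reload it and verify that Hadoop can actually find its native libraries. This sketch assumes the hadoop binary is on your PATH; hadoop checknative reports which native components were loaded:

source ~/.bashrc
echo $HADOOP_COMMON_LIB_NATIVE_DIR    # should print <your hadoop folder>/lib/native
hadoop checknative -a                 # reports whether libhadoop.so was found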

And although your SSH should already be working based on what you have said, set it up again just in case:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
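
If passwordless login still prompts for a password afterwards, the usual culprit is file permissions; a minimal check, assuming a standard OpenSSH setup:

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
ssh localhost    # should log in without asking for a password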

It seems like you haven't added the $HADOOP_INSTALL line to your .profile file that points to your main Hadoop folder. As Balduz suggests, HADOOP_HOME will work in place of the $HADOOP_INSTALL variable. I would go with his suggestion, but you can also fix it by adding...

export HADOOP_INSTALL=/path/to/hadoop/
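
For reference, a typical .profile block built around this variable might look like the following (a sketch; the path is a placeholder to adjust for your installation):

export HADOOP_INSTALL=/path/to/hadoop/
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin    # so the hadoop/hdfs commands resolve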

Please check your HADOOP_CONF_DIR (most likely set in .bashrc). It should point to $HADOOP_HOME/etc/hadoop.
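
A quick sanity check, assuming hdfs is on your PATH and HADOOP_HOME is set:

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
echo $HADOOP_CONF_DIR       # should name the directory containing core-site.xml and hdfs-site.xml
hdfs getconf -namenodes     # should resolve the namenode host from that configuration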
