When I try to start dfs using:
start-dfs.sh
I get a warning saying:
14/07/03 11:03:21 WARN util.NativeCodeLoader: Unable to
Edit your .bashrc file and add the following lines:
export HADOOP_HOME=path_to_your_hadoop_folder
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
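After reloading your shell (`source ~/.bashrc`), you can sanity-check that the native library directory these variables point at actually exists. This is a minimal sketch assuming the default tarball layout ($HADOOP_HOME/lib/native); if the directory is empty or missing, the WARN is expected:

```shell
# Verify the native library directory exists and list its contents
# (HADOOP_HOME is assumed to be set as in the exports above).
NATIVE_DIR="$HADOOP_HOME/lib/native"
if [ -d "$NATIVE_DIR" ]; then
    echo "native dir present: $NATIVE_DIR"
    ls "$NATIVE_DIR"    # should contain libhadoop.so on Linux builds
else
    echo "native dir missing: $NATIVE_DIR (the WARN is expected then)"
fi
```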
And although ssh should already be working from what you have described, regenerate the keys just in case:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
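One thing that often trips people up after the steps above: sshd silently ignores authorized_keys if the permissions are too open, so passwordless login keeps failing even though the key is in place. Tightening them is cheap insurance (a small sketch; the mkdir/touch just make it safe to run even if the files don't exist yet):

```shell
# sshd ignores authorized_keys when permissions are too permissive,
# so tighten them after copying the key in.
mkdir -p ~/.ssh
touch ~/.ssh/authorized_keys
chmod 700 ~/.ssh                  # directory: owner-only
chmod 600 ~/.ssh/authorized_keys  # key file: owner read/write only
```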
Stop the JVM from printing the stack guard warning to stdout/stderr, since that is what breaks the HDFS start script.
Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"
with:
export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"
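If you'd rather not edit the file by hand, the substitution can be scripted with sed. The snippet below demonstrates it on a scratch copy so you can see the effect before touching your real hadoop-env.sh (the temp file stands in for that file; point sed at your actual path when you apply it):

```shell
# Demonstrate the hadoop-env.sh edit on a scratch copy.
demo=$(mktemp)
echo 'export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"' > "$demo"

# Insert -XX:-PrintWarnings in front of the existing option.
sed -i 's/-Djava\.net\.preferIPv4Stack=true/-XX:-PrintWarnings -Djava.net.preferIPv4Stack=true/' "$demo"

cat "$demo"   # shows the rewritten HADOOP_OPTS line
```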
(This solution was found on Sumit Chawla's blog.)
It seems you haven't added the $HADOOP_INSTALL line to your .profile file pointing to your main hadoop folder. As Balduz suggests, HADOOP_HOME will work in place of the $HADOOP_INSTALL variable; I would go with his suggestion, but you can also fix it by adding:
export HADOOP_INSTALL=/path/to/hadoop/
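For completeness, a typical .profile block looks like this (the /usr/local/hadoop path is an assumption; substitute your own install location). Adding bin and sbin to PATH is what makes scripts like start-dfs.sh resolvable from anywhere:

```shell
# ~/.profile — example; adjust the path to your install
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
```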
Please check your HADOOP_CONF_DIR (most likely set in .bashrc). It should point to $HADOOP_HOME/etc/hadoop.
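Concretely, the line to check (or add) would look like this, assuming HADOOP_HOME is already exported as in the answers above:

```shell
# ~/.bashrc — the conf dir must contain core-site.xml, hdfs-site.xml, etc.
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
```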