Hadoop master cannot start slave with different $HADOOP_HOME

Submitted by  ̄綄美尐妖づ on 2019-12-13 18:26:01

Question


On the master, $HADOOP_HOME is /home/a/hadoop; on the slave, it is /home/b/hadoop.

When I run start-all.sh on the master, the master's NameNode starts successfully, but the slave's DataNode fails to start with the following message:

b@192.068.0.2: bash: line 0: cd: /home/b/hadoop/libexec/..: No such file or directory
b@192.068.0.2: bash: /home/b/hadoop/bin/hadoop-daemon.sh: No such file or directory

Any idea how to specify the slave's $HADOOP_HOME in the master's configuration?


Answer 1:


I don't know of a way to configure different home directories for the various slaves from the master, but the Hadoop FAQ says that the Hadoop framework does not require ssh and that the DataNode and TaskTracker daemons can be started manually on each node.

I would suggest writing your own startup scripts that take into account the specific environment of each node. However, make sure to include all the slaves in the master's slaves file; this appears to be necessary, as heartbeats alone are not enough for the master to register slaves.
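As a rough illustration of such a custom script, the sketch below maps each slave host to its own HADOOP_HOME and starts the DataNode and TaskTracker daemons remotely via hadoop-daemon.sh (the stock per-node start script in Hadoop 1.x). The hostnames and the /home/c/hadoop path are assumptions for illustration; the ssh call is left commented out so the script only prints what it would run.

```shell
#!/usr/bin/env bash
# Hypothetical per-slave HADOOP_HOME map (hosts and paths are examples,
# not from the original question except /home/b/hadoop).
declare -A HADOOP_HOMES=(
  ["192.168.0.2"]="/home/b/hadoop"
  ["192.168.0.3"]="/home/c/hadoop"
)

# Build the remote command that starts a given daemon using that
# node's own Hadoop installation directory.
build_start_cmd() {
  local home="$1" daemon="$2"
  echo "cd ${home} && ${home}/bin/hadoop-daemon.sh start ${daemon}"
}

for host in "${!HADOOP_HOMES[@]}"; do
  for daemon in datanode tasktracker; do
    cmd=$(build_start_cmd "${HADOOP_HOMES[$host]}" "$daemon")
    echo "Would run on ${host}: ${cmd}"
    # Uncomment to actually launch over ssh:
    # ssh "$host" "$cmd"
  done
done
```

Since each command is built from the target node's own path, the master's local $HADOOP_HOME never leaks into the remote invocation, which is exactly what start-all.sh gets wrong here.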



Source: https://stackoverflow.com/questions/12207795/hadoop-master-cannot-start-slave-with-different-hadoop-home
