Getting error while starting hadoop

Submitted by 纵饮孤独 on 2019-12-11 02:38:56

Question


I have Hadoop 0.20 and I start it by running $HADOOP/bin/start-all.sh. Every daemon starts, but the datanode throws this error:

localhost: Unrecognized option: -jvm
localhost: Error: Could not create the Java Virtual Machine.
localhost: Error: A fatal exception has occurred. Program will exit.

But I have installed Java:

[root@ulhshr1ld1 bin]# java -version
java version "1.7.0_01" Java(TM) SE Runtime Environment (build 1.7.0_01-b08)
Java HotSpot(TM) Server VM (build 21.1-b02, mixed mode)

I am also able to access http://localhost:50070/dfshealth.jsp and http://localhost:50030/jobtracker.jsp.

Can anyone please tell me what the problem is?


Answer 1:


It's a bug in Hadoop when it is run as root, and it has been fixed in newer releases. Here is the JIRA. Use the latest version of Hadoop.

Root gives complete access to the system, so create a separate user and start the daemons as that user. Why start the daemons as root at all?
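A hedged sketch of creating such a user: the user name `hadoop` and the install path `/usr/local/hadoop` are assumptions for illustration, not values from the question.

```shell
# Sketch: run the Hadoop daemons as a dedicated, unprivileged user
# instead of root. HADOOP_USER and HADOOP_HOME are assumed examples.
HADOOP_USER=hadoop
HADOOP_HOME=/usr/local/hadoop

# Only root may create users, and the install dir must exist.
if [ "$(id -u)" -eq 0 ] && [ -d "$HADOOP_HOME" ]; then
    useradd --create-home "$HADOOP_USER"
    chown -R "$HADOOP_USER":"$HADOOP_USER" "$HADOOP_HOME"
    # Start the daemons as the unprivileged user, not as root.
    su - "$HADOOP_USER" -c "$HADOOP_HOME/bin/start-all.sh"
fi
```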

Also, Apache recommends using Java 6 from Oracle. It looks like you are running Java 7, and Hadoop has not been tested thoroughly with Java 7.




Answer 2:


The -jvm option should be passed to jsvc when starting a secure datanode, but it is still passed to java when start-dfs.sh is run by root even though the secure datanode is disabled.

This is a bug in bin/hdfs or bin/hadoop.

Apply the following patch (from HDFS-1943):

diff --git bin/hdfs bin/hdfs
index 76ff689..ce9dc0a 100755
--- bin/hdfs
+++ bin/hdfs
@@ -71,7 +71,7 @@ elif [ "$COMMAND" = "secondarynamenode" ] ; then
   HADOOP_OPTS="$HADOOP_OPTS $HADOOP_SECONDARYNAMENODE_OPTS"
 elif [ "$COMMAND" = "datanode" ] ; then
   CLASS='org.apache.hadoop.hdfs.server.datanode.DataNode'
-  if [[ $EUID -eq 0 ]]; then
+  if [ "$starting_secure_dn" = "true" ]; then
     HADOOP_OPTS="$HADOOP_OPTS -jvm server $HADOOP_DATANODE_OPTS"
   else
     HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"

Copy the above into a file named HDFS.patch. To apply the patch, run in a terminal:

patch /path/to/file/to/be/patched < /path/to/patch/file

where /path/to/file/to/be/patched is .../bin/hadoop or .../bin/hdfs.
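As a self-contained illustration of the patch(1) workflow described above, here is a throwaway example; the file and patch contents are invented for the demo and are not the real Hadoop scripts.

```shell
# Demo of applying a unified diff with patch -p0 (throwaway files).
mkdir -p /tmp/patch-demo
cd /tmp/patch-demo
printf 'old line\n' > sample.txt

# A minimal unified diff: replaces "old line" with "new line".
cat > HDFS.patch <<'EOF'
--- sample.txt
+++ sample.txt
@@ -1 +1 @@
-old line
+new line
EOF

# -p0 keeps the path from the patch header unchanged.
patch -p0 < HDFS.patch
cat sample.txt
```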



Source: https://stackoverflow.com/questions/8411917/getting-error-while-starting-hadoop
