Errors while running Hadoop

魔方 西西 submitted on 2019-12-03 08:12:35

I had similar issues. It turned out Hadoop was actually binding to IPv6, so I added "export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true" to $HADOOP_HOME/conf/hadoop-env.sh.

Hadoop was binding to IPv6 even though I had disabled IPv6 on my system. Once I added that option to the environment file, everything started working fine.
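For reference, this is the line as it might appear (the conf/ path here assumes a Hadoop 1.x layout; newer versions keep this file under etc/hadoop):

# $HADOOP_HOME/conf/hadoop-env.sh
# Force the JVM to prefer IPv4 so the Hadoop daemons do not bind to IPv6
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true"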

Hope this helps someone.

Try to SSH to your local system using its IP address, in this case:

$ ssh 127.0.0.1
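If the SSH attempt asks for a password or is refused, note that Hadoop's start scripts generally need passwordless SSH to localhost; a minimal sketch, assuming OpenSSH with sshd already running:

$ ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ chmod 600 ~/.ssh/authorized_keys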

Once you are able to SSH in successfully, run the command below to list the open ports:

~$ lsof -i

Look for a listening socket of the form localhost:<PORT> (LISTEN).
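To narrow the output to listening sockets with numeric ports, you can filter it like this (-P and -n are standard lsof flags that disable port and host name resolution):

$ lsof -i -P -n | grep LISTEN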

Copy that <PORT> and use it to replace the existing port number in the value of the fs.default.name property in core-site.xml, in the Hadoop conf folder.

Save core-site.xml; this should resolve the issue.
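As a sketch, if lsof showed a listener on localhost:9000 (9000 is only an illustrative value; use the port you actually found), the property would look like this:

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>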

The NameNode (NN) maintains the namespace for HDFS, and it must be running for any filesystem operation on HDFS to work. Check the logs to see why the NN hasn't started. A TaskTracker is not required for operations on HDFS; the NN and DN alone are sufficient. See the http://goo.gl/8ogSk and http://goo.gl/NIWoK tutorials on how to set up Hadoop on a single node and on multiple nodes.
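To check the logs and bring up just the HDFS daemons, something like the following works on a Hadoop 1.x layout (the exact log file name includes your username and hostname):

$ tail -n 50 $HADOOP_HOME/logs/hadoop-*-namenode-*.log
$ $HADOOP_HOME/bin/start-dfs.sh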

Mayuresh Gadge

All the files in the bin directory are executables. Just copy the command and paste it into the terminal. Make sure the address is right, i.e. the generic user must be replaced with your own username. That should do the trick.
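As an illustration only (the path and user here are hypothetical placeholders; substitute your own), a copied command with the generic user replaced might look like:

$ /home/hduser/hadoop/bin/start-all.sh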
