"hadoop fs -ls" listing files in the present working directory

Posted by 断了今生、忘了曾经 on 2020-01-24 08:50:10

Question


I am following Udacity's course on Hadoop, which instructs using the command hadoop fs -ls to list files. But on my machine running Ubuntu, it instead lists the files in the present working directory. What am I doing wrong?

Running which hadoop gives the output: /home/usrname/hadoop-2.5.1//hadoop

Are the double slashes in the path the cause of this problem?


Answer 1:


Your file system must be pointing to the local file system. Modify the configuration to point it to HDFS and restart the processes.

Check this configuration:

<property>
    <name>fs.default.name</name>
    <value>hdfs://<IP>:<Port></value>
</property>
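For reference, fs.default.name is the legacy key; on Hadoop 2.x (which the question's path suggests) the preferred key is fs.defaultFS. A minimal core-site.xml sketch might look like the following, where the NameNode host and port are placeholders you must replace with your own:

```xml
<?xml version="1.0"?>
<configuration>
    <property>
        <!-- Preferred key on Hadoop 2.x; fs.default.name is the deprecated alias -->
        <name>fs.defaultFS</name>
        <!-- Placeholder NameNode address: replace with your actual host and port -->
        <value>hdfs://namenode-host:9000</value>
    </property>
</configuration>
```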



Answer 2:


You have to set the path to the Hadoop root folder in your current user's .bashrc file, for example:

export HADOOP_HOME=/home/seo/hadoop/hadoop-1.2.1

then add it to your system PATH variable:

export PATH=$PATH:$HADOOP_HOME/bin
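One thing to watch when editing PATH by hand: a stray trailing colon creates an empty entry, which the shell interprets as the current directory and which can shadow the binary you meant to run. A quick sketch to check for that (pure shell, no Hadoop required):

```shell
# An empty entry in PATH (e.g. from a trailing or doubled colon) is
# treated as the current directory by the shell's lookup rules.
check_path() {
    # Wrap the value in colons so a leading, trailing, or doubled
    # colon all show up as "::" in the wrapped string.
    case ":$1:" in
        *::*) echo "PATH contains an empty entry" ;;
        *)    echo "PATH looks clean" ;;
    esac
}

check_path "/usr/bin:/home/usrname/hadoop-2.5.1/bin:"  # prints "PATH contains an empty entry"
check_path "/usr/bin:/home/usrname/hadoop-2.5.1/bin"   # prints "PATH looks clean"
```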

Then running

hadoop fs -ls

will list your HDFS files, provided your Hadoop cluster is up and running.




Answer 3:


It's likely that your client is not picking up the correct Hadoop configuration files, which is why it defaults to your local filesystem.

Set HADOOP_CONF_DIR to the directory containing the Hadoop configuration files. Also verify that fs.defaultFS is set correctly in core-site.xml.
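As a sketch, assuming the install location implied by the question's `which hadoop` output (/home/usrname/hadoop-2.5.1) and the standard Hadoop 2.x directory layout, the environment setup might look like:

```shell
# Hypothetical paths based on the question's `which hadoop` output;
# adjust them to your actual installation layout.
export HADOOP_HOME=/home/usrname/hadoop-2.5.1
# Hadoop 2.x keeps core-site.xml and friends under etc/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
echo "$HADOOP_CONF_DIR"   # prints /home/usrname/hadoop-2.5.1/etc/hadoop
```

Putting these lines in ~/.bashrc (and re-sourcing it) ensures every new shell resolves both the hadoop binary and its configuration consistently.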




Answer 4:


Can you please try running the command below? Do so after confirming that the configuration suggested by Ashish is present in your core-site.xml.

hadoop dfs -ls hdfs://IP:PORT/

Thanks, Arani



Source: https://stackoverflow.com/questions/27684746/hadoop-fs-ls-listing-files-in-the-present-working-directory
