I am new to the Hadoop Distributed File System. I have completed a single-node Hadoop installation on my machine, but when I try to upload data to HDFS it gives a permission error.
For Hadoop 3.x, if you try to create a file on HDFS while unauthenticated (e.g. as user=dr.who), you will get this error.
This is not recommended for systems that need to be secure; however, if you'd like to disable file permissions entirely, note that in Hadoop 3 the hdfs-site.xml setting has changed to:
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
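Note that changes to hdfs-site.xml only take effect after HDFS is restarted. On a typical single-node install that is (assuming a standard HADOOP_HOME):

$HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh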
Start a shell as hduser (from root) and run your command:
sudo -u hduser bash
hadoop fs -put /usr/local/input-data/ /input
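If you only need the one command, you can also run it directly as hduser without opening a full shell (assuming sudo is configured to allow it):

sudo -u hduser hadoop fs -put /usr/local/input-data/ /input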
[update]
Also note that the hdfs user is the super user and has all r/w privileges.
I've solved this problem using the following steps:
su hdfs
hadoop fs -put /usr/local/input-data/ /input
exit
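A variation on this, so you don't have to switch users for every upload: as the superuser, create the target directory once and hand ownership to your regular user. A sketch, assuming the hduser account and the /input path from this question, and that sudo can run commands as the local hdfs user:

sudo -u hdfs hadoop fs -mkdir -p /input
sudo -u hdfs hadoop fs -chown hduser:supergroup /input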
You are experiencing two separate problems here:
hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)
Here, the user hduser does not have access to the local directory /usr/local/input-data. That is, your local permissions are too restrictive. You should change them.
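You can confirm this by inspecting the local permissions as hduser:

ls -ld /usr/local/input-data/
ls -l /usr/local/input-data/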
hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x
Here, the user root (since you are using sudo) does not have access to the HDFS directory /input. As you can see: hduser:supergroup:rwxr-xr-x says only hduser has write access. Hadoop doesn't really respect root as a special user.
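You can inspect the HDFS-side owner/group/mode triples yourself; the -d flag lists the directory entry itself rather than its contents:

hadoop fs -ls -d /
hadoop fs -ls /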
To fix this, I suggest you change the permissions on the local data:
sudo chmod -R og+rx /usr/local/input-data/
Then, try the put command again as hduser.
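That is, without sudo this time:

hadoop fs -put /usr/local/input-data/ /input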
I solved this problem temporarily by disabling DFS permissions, by adding the property below to conf/hdfs-site.xml:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
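Remember this also needs an HDFS restart to take effect. One quick way to verify that permission checks are off is to write as a user that normally has no rights, using the HADOOP_USER_NAME trick from the answer below (/permcheck is just a throwaway test path):

HADOOP_USER_NAME=nobody hadoop fs -touchz /permcheck
hadoop fs -rm /permcheck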
I had a similar situation, and here is my approach, which is somewhat different:
HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /
What you actually do is read the local file in accordance with your local permissions, but when placing the file on HDFS you are authenticated as the user hdfs. You can do this with another ID (beware of real authentication scheme configuration, but this is usually not the case).
Advantages:
You don't need sudo.
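The same environment-variable trick works for any other HDFS command. For example, to prepare a directory owned by your regular user (the /data path here is just an illustration):

HADOOP_USER_NAME=hdfs hdfs dfs -mkdir /data
HADOOP_USER_NAME=hdfs hdfs dfs -chown hduser /data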