I have 3 DataNodes running, but while running a job I am getting the following error:
java.io.IOException: File /user/ashsshar/olhcache/load
What I usually do when this happens is go to the tmp/hadoop-&lt;username&gt;/dfs/ directory (under /tmp with the default configuration) and manually delete the data and name folders (assuming you are running in a Linux environment). Note that this deletes everything stored in HDFS.
Then format the DFS by running bin/hadoop namenode -format (make sure you answer with a capital Y when asked whether you want to format; if you are not asked, re-run the command).
You can then start Hadoop again by running bin/start-all.sh.
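Putting those steps together, a minimal recovery sequence looks like the sketch below. It assumes the default hadoop.tmp.dir of /tmp/hadoop-&lt;username&gt; and a Hadoop 1.x layout; adjust the paths to your own configuration:

    # WARNING: this wipes all data stored in HDFS
    bin/stop-all.sh                          # stop any running daemons first
    rm -rf /tmp/hadoop-$(whoami)/dfs/data    # DataNode storage (default hadoop.tmp.dir; adjust if overridden)
    rm -rf /tmp/hadoop-$(whoami)/dfs/name    # NameNode storage
    bin/hadoop namenode -format              # re-format HDFS; answer with a capital Y when prompted
    bin/start-all.sh                         # bring the daemons back up

After the daemons come back up, give the DataNodes a moment to register with the NameNode before re-submitting the job.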