Hadoop Windows setup. Error while running WordCountJob: “No space available in any of the local directories”

Submitted by 半腔热情 on 2021-01-27 06:32:16

Question


I am following this video tutorial, trying to set up Hadoop on my machine:

  • How to Install Hadoop on Windows 10

I've set it up successfully: no errors while executing start-all.cmd from the sbin directory.

But when I try to execute my WordCount.jar file, the following error occurs:

19/02/23 11:42:59 INFO localizer.ResourceLocalizationService: Created localizer for container_1550911199370_0001_02_000001
19/02/23 11:42:59 INFO localizer.ResourceLocalizationService: Localizer failed
org.apache.hadoop.util.DiskChecker$DiskErrorException: No space available in any of the local directories.
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:399)
        at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:151)
        at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:132)
        at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:116)
        at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.getLocalPathForWrite(LocalDirsHandlerService.java:545)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService$LocalizerRunner.run(ResourceLocalizationService.java:1142)
19/02/23 11:42:59 ERROR nodemanager.DeletionService: Exception during execution of task in DeletionService
java.lang.NullPointerException: path cannot be null
        at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:204)
        at org.apache.hadoop.fs.FileContext.fixRelativePart(FileContext.java:281)
        at org.apache.hadoop.fs.FileContext.delete(FileContext.java:769)
        at org.apache.hadoop.yarn.server.nodemanager.DeletionService$FileDeletionTask.run(DeletionService.java:273)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
19/02/23 11:42:59 INFO container.ContainerImpl: Container container_1550911199370_0001_02_000001 transitioned from LOCAL

I am sure that I have enough space to process the job; my system is freshly installed.

Here are my configuration files:

core-site.xml

<configuration>
   <property>
       <name>fs.defaultFS</name>
       <value>hdfs://localhost:9000</value>
   </property>
</configuration>

hdfs-site.xml

<configuration>
   <property>
       <name>dfs.replication</name>
       <value>1</value>
   </property>
   <property>
       <name>dfs.namenode.name.dir</name>
       <value>file:///C:/hadoop-2.8.0/data/namenode</value>
   </property>
   <property>
       <name>dfs.datanode.data.dir</name>
       <value>file:///C:/hadoop-2.8.0/data/datanode</value>
   </property>
</configuration>

mapred-site.xml

<configuration>
   <property>
       <name>mapreduce.framework.name</name>
       <value>yarn</value>
   </property>
</configuration>

yarn-site.xml

<configuration>
   <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
   </property>
   <property>
        <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
   </property>
   <property>
        <name>yarn.nodemanager.disk-health-checker.enable</name>
        <value>false</value>
   </property>
</configuration>

Here is how I execute the jar (after preparing the input/output directories):

hadoop fs -mkdir /top
hadoop fs -mkdir /top/input
hadoop fs -mkdir /top/output
hadoop fs -put C:/hadoop-2.8.0/wordcount2.txt /top/input
hadoop jar C:/hadoop-2.8.0/WordCount.jar /top/input/wordcount2.txt /top/output/output.txt

Answer 1:


Try formatting your namenode and datanode.
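One way to do this on Windows, assuming the dfs.namenode.name.dir and dfs.datanode.data.dir paths from the question (note that re-formatting erases everything stored in HDFS):

    REM Stop the daemons first (run from C:\hadoop-2.8.0\sbin)
    stop-all.cmd
    REM Remove the old storage directories so the datanode is re-initialized too
    rmdir /S /Q C:\hadoop-2.8.0\data\namenode
    rmdir /S /Q C:\hadoop-2.8.0\data\datanode
    REM Re-format the namenode, then restart the daemons
    hdfs namenode -format
    start-all.cmd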




Answer 2:


The main error is:

org.apache.hadoop.util.DiskChecker$DiskErrorException: No space available in any of the local directories.

To fix this issue, you can try the following:

(1) Change the directory format in hdfs-site.xml

In hdfs-site.xml, try replacing the following values:

<configuration>
   <property>
       <name>dfs.replication</name>
       <value>1</value>
   </property>
   <property>
       <name>dfs.namenode.name.dir</name>
       <value>file:///C:/hadoop-2.8.0/data/namenode</value>
   </property>
   <property>
       <name>dfs.datanode.data.dir</name>
       <value>file:///C:/hadoop-2.8.0/data/datanode</value>
   </property>
</configuration>

with

<configuration>
   <property>
       <name>dfs.replication</name>
       <value>1</value>
   </property>
   <property>
       <name>dfs.namenode.name.dir</name>
       <value>C:\hadoop-2.8.0\data\namenode</value>
   </property>
   <property>
       <name>dfs.datanode.data.dir</name>
       <value>C:\hadoop-2.8.0\data\datanode</value>
   </property>
</configuration>

(2) Directory read & write permissions

Check that the current user has permission to read and write in the Hadoop directory.
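If in doubt, you can inspect the permissions and, if needed, grant the current user full access with Windows' built-in icacls tool; a sketch, assuming the C:\hadoop-2.8.0 install path from the question:

    REM Show the current ACLs on the Hadoop directory
    icacls C:\hadoop-2.8.0
    REM Grant the current user full control, recursively
    icacls C:\hadoop-2.8.0 /grant %USERNAME%:(OI)(CI)F /T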

(3) Node manager directories

Try adding the following properties to the yarn-site.xml file:

   <property>
        <name>yarn.nodemanager.local-dirs</name>
        <value>C:/hadoop-2.8.0/yarn/local</value>
   </property>
   <property>
        <name>yarn.nodemanager.log-dirs</name>
        <value>C:/hadoop-2.8.0/yarn/logs</value>
   </property>
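YARN will normally create these directories on startup, but creating them in advance rules out a permission problem at that step; assuming the paths above:

    mkdir C:\hadoop-2.8.0\yarn\local
    mkdir C:\hadoop-2.8.0\yarn\logs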

After changing the directories, try to format the namenode.

If it still doesn't work, you can refer to the following step-by-step guides to installing Hadoop on Windows, which worked for me:

  • Step by step Hadoop 2.8.0 installation on Window 10
  • How to Run Hadoop wordcount MapReduce Example on Windows 10


Source: https://stackoverflow.com/questions/54840463/hadoop-windows-setup-error-while-running-wordcountjob-no-space-available-in-a
