Why does a job fail with “No space left on device”, but df says otherwise?

backend · open · 8 answers · 1253 views
無奈伤痛 2020-12-04 15:51

When performing a shuffle, my Spark job fails with "no space left on device", but when I run df -h it says I have free space left! Why does this happen?

8 Answers
  •  -上瘾入骨i
    2020-12-04 16:19

    I ran into a similar problem. By default, Spark uses /tmp to store intermediate shuffle files. While the job is running, you can run df -h repeatedly and watch the usage of the filesystem mounted at "/" grow; when that device runs out of space, this exception is thrown. To solve the problem, I set spark.local.dir in SPARK_HOME/conf/spark-defaults.conf (the same setting can be made via the SPARK_LOCAL_DIRS environment variable in spark-env.sh) to a path on a filesystem with enough free space.
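    As a sketch, the config change above might look like the fragment below; /data/spark-tmp is a hypothetical mount point, so substitute a large filesystem on your own machines. A comma-separated list of directories lets Spark spread shuffle spill across multiple disks:

    ```
    # $SPARK_HOME/conf/spark-defaults.conf
    # /data/spark-tmp is a placeholder; point this at a mount with ample free space.
    # Multiple comma-separated dirs spread shuffle spill across disks.
    spark.local.dir    /data/spark-tmp,/data2/spark-tmp
    ```

    Note that on YARN this property is ignored and the node manager's local directories are used instead, so check which cluster manager you run under before relying on it.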
