java.io.IOException: Cannot obtain block length for LocatedBlock

Submitted by 岁酱吖の on 2019-12-06 13:35:09

Here is a good description of the problem and its cause:

https://community.hortonworks.com/answers/37414/view.html

For us, running the command hdfs debug recoverLease -path <path-of-the-file> -retries 3 solved the problem.
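If many files are affected, the same lease recovery can be triggered from the HDFS Java client via DistributedFileSystem.recoverLease. A minimal sketch, assuming fs.defaultFS in your configuration points at the HDFS cluster (the retry loop mirrors the CLI's -retries 3):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    public class RecoverLease {
        public static void main(String[] args) throws Exception {
            // Assumes fs.defaultFS on the classpath points at HDFS.
            DistributedFileSystem dfs =
                    (DistributedFileSystem) FileSystem.get(new Configuration());
            Path file = new Path(args[0]); // path of the stuck file
            for (int i = 0; i < 3; i++) {
                // recoverLease returns true once the file has been closed.
                if (dfs.recoverLease(file)) {
                    System.out.println("Lease recovered for " + file);
                    return;
                }
                Thread.sleep(5000); // give the namenode time to finalize the last block
            }
            System.out.println("Lease recovery still in progress for " + file);
        }
    }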

It is very hard to determine whether a file in an HDFS folder is unclosed. You probably have to run an hdfs dfs -cat test on each one, or regularly check for lost file blocks (every hour, or after every restart of the cluster).
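If you want to script that check instead of cat-ing every file, the HDFS client also exposes DistributedFileSystem.isFileClosed, which reports whether a file's last block has been finalized. A sketch, with /flume/events as an example directory to audit:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    public class FindUnclosedFiles {
        public static void main(String[] args) throws Exception {
            DistributedFileSystem dfs =
                    (DistributedFileSystem) FileSystem.get(new Configuration());
            // Example directory; replace with the folder you want to audit.
            for (FileStatus status : dfs.listStatus(new Path("/flume/events"))) {
                if (status.isFile() && !dfs.isFileClosed(status.getPath())) {
                    System.out.println("Unclosed: " + status.getPath());
                }
            }
        }
    }

Something like this could serve as the hourly check mentioned above, e.g. run from cron.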

Tri Pham

I ran into the same issue. Some files were opened by Flume but never closed (I am not sure of the reason). You need to find their names with the command:

hdfs fsck /directory/of/locked/files/ -files -openforwrite

Then just remove them, or try to recover the files with hdfs debug recoverLease -path <path-of-the-file> -retries 3, as Joe23 suggested.
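Both options can be combined into one pass with the same client API: find the unclosed files under the directory, then either delete them or kick off lease recovery. A sketch only; the policy flag is a placeholder I introduced, not part of either answer:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    public class CleanupOpenFiles {
        // Hypothetical policy switch: recover the lease, or just delete the file.
        private static final boolean RECOVER_INSTEAD_OF_DELETE = true;

        public static void main(String[] args) throws Exception {
            DistributedFileSystem dfs =
                    (DistributedFileSystem) FileSystem.get(new Configuration());
            for (FileStatus status : dfs.listStatus(new Path("/directory/of/locked/files"))) {
                if (!status.isFile() || dfs.isFileClosed(status.getPath())) {
                    continue; // healthy file, leave it alone
                }
                if (RECOVER_INSTEAD_OF_DELETE) {
                    boolean closed = dfs.recoverLease(status.getPath());
                    System.out.println((closed ? "Recovered: " : "Recovery started: ")
                            + status.getPath());
                } else {
                    dfs.delete(status.getPath(), false); // non-recursive delete
                    System.out.println("Deleted: " + status.getPath());
                }
            }
        }
    }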
