Download file weekly from FTP to HDFS

被刻印的时光 ゝ submitted on 2019-12-04 07:46:06

Since you're using CDH5, it's worth noting that the NFSv3 interface to HDFS is included in that Hadoop distribution. See "Configuring an NFSv3 Gateway" in the CDH5 Installation Guide for setup instructions.

Once that's done, you could use wget, curl, Python, etc. to put the file onto the NFS mount. You probably want to drive this through Oozie: in the job Designer, create a copy of the "Shell" command, fill in the command you've chosen to do the data transfer (a Python script, curl, ftp, etc.), and parameterize the job using ${myVar}.
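As a sketch of what that shell command might look like, assuming the NFSv3 gateway is mounted at /hdfs_nfs (the mount point, target directory, and function names here are all hypothetical, not part of the original answer):

```shell
#!/bin/bash
# Sketch only -- NFS_MOUNT and the target directory are assumptions;
# adjust them for how your cluster mounts the HDFS NFSv3 gateway.
NFS_MOUNT="/hdfs_nfs/data/incoming"

# Build the destination path on the NFS mount from a source URL.
dest_path() {
  local url="$1"
  printf '%s/%s\n' "$NFS_MOUNT" "$(basename "$url")"
}

# Download to a local temp file first, then mv it onto the NFS mount
# so the file appears in HDFS in one step rather than partially written.
fetch_to_hdfs() {
  local url="$1"
  local tmp="/tmp/$(basename "$url").$$"
  curl -fsS -o "$tmp" "$url" && mv "$tmp" "$(dest_path "$url")"
}

# Invoked from the Oozie shell action as, e.g.: fetch_to_hdfs "${myVar}"
```

Downloading to /tmp and then moving the finished file is a common trick to avoid downstream jobs picking up a half-written file.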

It's not perfect, but I think it's fairly elegant.

I suppose you want to pull a file.

One simple solution is to use an Oozie coordinator that triggers a workflow on a weekly schedule.
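A minimal weekly coordinator definition might look like this (the app name, workflow path, and start/end dates are placeholders, not from the original answer):

```xml
<!-- Sketch of a weekly coordinator; app-path and dates are assumptions. -->
<coordinator-app name="weekly-ftp-pull" frequency="${coord:days(7)}"
                 start="2019-12-01T00:00Z" end="2020-12-01T00:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.2">
  <action>
    <workflow>
      <app-path>${nameNode}/apps/ftp-pull-workflow</app-path>
    </workflow>
  </action>
</coordinator-app>
```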

The workflow should contain a shell action:

http://oozie.apache.org/docs/3.3.0/DG_ShellActionExtension.html
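The shell action inside workflow.xml could be sketched as follows (the action name, script name, and paths are hypothetical):

```xml
<!-- Sketch of the shell action in workflow.xml; names are assumptions. -->
<action name="ftp-pull">
  <shell xmlns="uri:oozie:shell-action:0.1">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <exec>pull.sh</exec>
    <file>${nameNode}/apps/ftp-pull-workflow/pull.sh#pull.sh</file>
  </shell>
  <ok to="end"/>
  <error to="fail"/>
</action>
```

The `<file>` element ships the script to the node that runs the action, so pull.sh only needs to live in the workflow's HDFS directory.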

The script it runs can be as simple as:

wget http://myftp.com/file.name

You can do whatever else you need in that script, such as copying the downloaded file into HDFS with hdfs dfs -put.
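A fuller sketch of such a script, assuming a hypothetical dated /data/incoming layout in HDFS (the URL is the one from the answer above; hdfs_target, run, and the directory layout are illustrative names, not from the original):

```shell
#!/bin/bash
# Sketch of the weekly shell-action script. The HDFS path and the
# dated-directory layout are assumptions -- adapt them to your cluster.

# Compute the HDFS target path for a given run date (YYYY-MM-DD).
hdfs_target() {
  printf '/data/incoming/%s/file.name' "$1"
}

# Pull the file over HTTP/FTP and push it into HDFS under a dated directory.
run() {
  local run_date="$1"
  wget -q -O /tmp/file.name http://myftp.com/file.name
  hdfs dfs -mkdir -p "/data/incoming/${run_date}"
  hdfs dfs -put -f /tmp/file.name "$(hdfs_target "$run_date")"
  rm -f /tmp/file.name
}

# The coordinator's nominal time can be passed in as the script's
# first argument, e.g.: run "$1"
```

Writing each week's file under its own dated directory keeps reruns idempotent: re-executing a week's action simply overwrites that week's copy.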
