Transferring files from remote node to HDFS with Flume

There is no out-of-the-box solution for this case, but you could try these workarounds:

  1. You could create your own source implementation for this purpose (using the Flume SDK). For example, this project appears to connect to a remote directory over SSH and use it as a source.
  2. You could write a scheduled script that periodically copies remote files into a local spool directory, then point a Spooling Directory source at it (see the sketch after this list).
  3. You could write a script that reads your remote data and writes it to standard output, then run that script from an Exec source (also covered in the sketch below).
  4. You could run the Flume agent directly on the machine where the data is located (see Can Spool Dir of flume be in remote machine? ).
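
Here is a minimal sketch of options 2 and 3 for a Flume 1.x agent. All host names, paths and the HDFS URL (`remote-host`, `/data/logs`, `/var/spool/flume`, `namenode:8020`) are placeholders, not anything from the original question:

```
# Option 2 (cron on the Flume host, hypothetical paths): pull remote files into
# the local spooling directory every 5 minutes, e.g.
#   */5 * * * * rsync -a user@remote-host:/data/logs/ /var/spool/flume/

# Option 3: a single agent whose Exec source streams a remote file over ssh.
a1.sources  = sshSrc
a1.channels = ch1
a1.sinks    = hdfsSink

# Exec source: run the command and turn each output line into an event
a1.sources.sshSrc.type     = exec
a1.sources.sshSrc.command  = ssh user@remote-host tail -F /data/logs/app.log
a1.sources.sshSrc.channels = ch1

# Simple in-memory channel (events are lost if the agent dies)
a1.channels.ch1.type     = memory
a1.channels.ch1.capacity = 10000

# HDFS sink: write plain text files partitioned by day
a1.sinks.hdfsSink.type          = hdfs
a1.sinks.hdfsSink.hdfs.path     = hdfs://namenode:8020/flume/events/%Y-%m-%d
a1.sinks.hdfsSink.hdfs.fileType = DataStream
a1.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
a1.sinks.hdfsSink.channel       = ch1
```

You would start it with something like `flume-ng agent --conf conf --conf-file exec-ssh.conf --name a1`. Keep in mind that the Exec source gives no delivery guarantees if the ssh command dies, which is why the spooling-directory variants are usually preferred.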

Why don't you run two different Flume agents, one on the remote machine and one on your data node? The agent on the remote machine can read the spooling directory and send events to an Avro sink, and the agent on the data node can receive them through an Avro source and write the data to HDFS.
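
A sketch of that two-agent setup, again with placeholder host names, ports and directories (port 4545, `datanode.example.com` and all paths are assumptions):

```
# Agent on the remote machine: Spooling Directory source -> Avro sink
remote.sources  = spool
remote.channels = ch1
remote.sinks    = avroOut

remote.sources.spool.type     = spooldir
remote.sources.spool.spoolDir = /var/spool/flume
remote.sources.spool.channels = ch1

# File channel so events survive an agent restart
remote.channels.ch1.type          = file
remote.channels.ch1.checkpointDir = /var/lib/flume/checkpoint
remote.channels.ch1.dataDirs      = /var/lib/flume/data

remote.sinks.avroOut.type     = avro
remote.sinks.avroOut.hostname = datanode.example.com
remote.sinks.avroOut.port     = 4545
remote.sinks.avroOut.channel  = ch1

# Agent on the data node: Avro source -> HDFS sink
dn.sources  = avroIn
dn.channels = ch2
dn.sinks    = hdfsOut

dn.sources.avroIn.type     = avro
dn.sources.avroIn.bind     = 0.0.0.0
dn.sources.avroIn.port     = 4545
dn.sources.avroIn.channels = ch2

dn.channels.ch2.type = memory

dn.sinks.hdfsOut.type                   = hdfs
dn.sinks.hdfsOut.hdfs.path              = hdfs://namenode:8020/flume/events/%Y-%m-%d
dn.sinks.hdfsOut.hdfs.fileType          = DataStream
dn.sinks.hdfsOut.hdfs.useLocalTimeStamp = true
dn.sinks.hdfsOut.channel                = ch2
```

Both agent definitions can live in the same properties file; you start `flume-ng agent ... --name remote` on the remote machine and `flume-ng agent ... --name dn` on the data node.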
