Transferring files from remote node to HDFS with Flume
I have a bunch of binary files compressed in *.gz format. They are generated on a remote node and must be transferred to HDFS, which lives on one of the datacenter's servers. I'm exploring the option of sending the files with Flume, specifically with a Spooling Directory configuration, but apparently that only works when the spooled directory is local to the node where the Flume agent runs. Any suggestions on how to tackle this problem?

arghtype: There is no out-of-the-box solution for such a case, but you could try these workarounds: you could create your own source implementation for such a transfer.
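Besides a custom source, a common workaround is Flume's tiered-collection topology: run a small agent on the remote node with a Spooling Directory source feeding an Avro sink, and a second agent near the Hadoop cluster with an Avro source feeding an HDFS sink. The spooldir source then only ever reads a local directory, while the Avro RPC hop carries the data across the network. Below is a minimal sketch of the two agent configurations; the agent names (remote, collector), the host collector.example.com, the ports, and all paths are placeholders, not anything from the original post.

```
# --- Agent on the remote node (hypothetical name: remote) ---
remote.sources  = spool
remote.channels = ch
remote.sinks    = avroOut

# Spooling Directory source picks up completed files dropped into a local dir.
remote.sources.spool.type     = spooldir
remote.sources.spool.spoolDir = /var/spool/flume/outgoing
remote.sources.spool.channels = ch
# Note: binary .gz files must not be split on newlines; a whole-file
# deserializer such as BlobDeserializer (morphline module) would be needed,
# since the default LINE deserializer assumes text.

# Avro sink ships events over the network to the collector agent.
remote.sinks.avroOut.type     = avro
remote.sinks.avroOut.hostname = collector.example.com
remote.sinks.avroOut.port     = 4545
remote.sinks.avroOut.channel  = ch

# Durable file channel so events survive an agent restart.
remote.channels.ch.type          = file
remote.channels.ch.checkpointDir = /var/lib/flume/checkpoint
remote.channels.ch.dataDirs      = /var/lib/flume/data

# --- Agent near the Hadoop cluster (hypothetical name: collector) ---
collector.sources  = avroIn
collector.channels = ch
collector.sinks    = hdfsOut

# Avro source listens for events sent by the remote agent.
collector.sources.avroIn.type     = avro
collector.sources.avroIn.bind     = 0.0.0.0
collector.sources.avroIn.port     = 4545
collector.sources.avroIn.channels = ch

# HDFS sink writes the incoming events into HDFS.
collector.sinks.hdfsOut.type          = hdfs
collector.sinks.hdfsOut.hdfs.path     = hdfs://namenode:8020/flume/ingest
collector.sinks.hdfsOut.hdfs.fileType = DataStream
collector.sinks.hdfsOut.channel       = ch

collector.channels.ch.type = memory
```

Each agent is then started with something like `flume-ng agent --conf conf --conf-file remote.conf --name remote` (and the same on the collector host with its own file and name). The design point is simply that the spooling-directory limitation is sidestepped by running the agent where the files are produced and letting Avro RPC do the remote hop.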