I am writing a shell script to put data into Hadoop as soon as they are generated. I can ssh to my master node, copy the files to a folder over there, and then put them into HDFS. I am looking for a way to avoid copying the files to the master node's local disk first.
Try this (untested):
cat test.txt | ssh username@masternode "hadoop dfs -put - hadoopFoldername/test.txt"
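The `-` argument makes `hadoop dfs -put` read from standard input, which is what lets you skip the intermediate copy on the master node's disk. For the "as soon as they are generated" part, here is a minimal sketch that streams each new file as it appears, assuming inotify-tools is installed on the machine producing the data; WATCH_DIR, username, masternode, and hadoopFoldername are placeholders:

    #!/bin/sh
    # Watch a local directory and stream each finished file straight into
    # HDFS, without staging a copy on the master node's local disk.
    WATCH_DIR=/path/to/generated    # hypothetical: where your data lands
    inotifywait -m -e close_write --format '%f' "$WATCH_DIR" |
    while read -r fname; do
        cat "$WATCH_DIR/$fname" |
            ssh username@masternode "hadoop dfs -put - hadoopFoldername/$fname"
    done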
I've used similar tricks to copy directories around:
tar cf - . | ssh remote "(cd /destination && tar xvf -)"
This pipes the output of the local tar into the standard input of the remote tar, so the whole directory tree is copied without ever writing an intermediate archive file.
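The same pattern works in reverse to pull a directory from the remote machine; remote and /source are again placeholders:

    ssh remote "(cd /source && tar cf - .)" | tar xvf -

On slow links, adding z to both tar invocations (tar czf - / tar xzf -) or using ssh -C compresses the stream in transit.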