We can't say definitively without knowing what's in your script.
For instance, if you're doing this:
# DON'T DO THIS: Violates http://mywiki.wooledge.org/DontReadLinesWithFor
for line in $(cat); do
: ...do something with "$line"...
done
...the command substitution $(cat) must read all of stdin (that is, wait until the writer closes the pipe) before the loop body runs even once, resulting in the hang you describe.
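You can see this for yourself by feeding such a script from a writer that keeps the pipe open; the sleep and the script name below are purely illustrative:
# Nothing inside your_script's loop runs until this block exits
# and closes the pipe, i.e. not until the sleep finishes.
{ printf 'first\nsecond\n'; sleep 30; } | your_script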
However, if you're following best practices (per BashFAQ #1), your code will operate more like this:
while IFS= read -r line; do
: ...do something with "$line"...
done
...and that'll behave properly, processing each line as soon as the writer emits it, subject to any output buffering the writer performs. For hints on controlling buffering, see BashFAQ #9.
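As a minimal sketch of both points: the slow_writer function and some_program below are placeholders of my own, and stdbuf comes from GNU coreutils rather than bash itself:
# Hypothetical slow writer: emits one line per second.
slow_writer() {
  for i in 1 2 3; do
    printf 'line %d\n' "$i"
    sleep 1
  done
}

# Each line is handled as soon as it arrives, not after the writer exits.
slow_writer | while IFS= read -r line; do
  printf '%s got: %s\n' "$(date +%T)" "$line"
done

# If a writer block-buffers its output when stdout is a pipe,
# stdbuf -oL (GNU coreutils) asks it to line-buffer instead:
stdbuf -oL some_program | while IFS= read -r line; do
  printf 'got: %s\n' "$line"
done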
Finally, quoting from DontReadLinesWithFor:
The final issue with reading lines with for is inefficiency. A while read loop reads one line at a time from an input stream; $(<afile) slurps the entire file into memory all at once. For small files, this is not a problem, but if you're reading large files, the memory requirement will be enormous. (Bash will have to allocate one string to hold the file, and another set of strings to hold the word-split results... essentially, the memory allocated will be twice the size of the input file.)
Obviously, if the input is indefinite (a stream that never closes), the memory requirements and completion time are likewise unbounded.