Parsing the output of ls to iterate through a list of files is bad practice. So how should I go about iterating through a list of files in the order in which they were first created?
Here's a way using stat with an associative array.
n=0
declare -A arr
for file in *; do
    # modified=$(stat -f "%m" "$file")   # For use with BSD/OS X
    modified=$(stat -c "%Y" "$file")     # For use with GNU/Linux
    # Ensure the timestamp used as the array key is unique
    if [[ -v arr[$modified] ]]; then     # key already taken by an earlier file (bash 4.3+)
        modified=${modified}.$n
        ((n++))
    fi
    arr[$modified]="$file"
done
files=()
for index in $(IFS=$'\n'; echo "${!arr[*]}" | sort -n); do
    files+=("${arr[$index]}")
done
Since sort sorts lines, $(IFS=$'\n'; echo "${!arr[*]}" | sort -n) sorts the indices of the associative array numerically: setting the field separator to a newline inside the subshell puts each index on its own line for sort -n. The quoting in arr[$modified]="$file" and files+=("${arr[$index]}") ensures that file names with oddities such as embedded spaces or newlines are preserved.
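Once populated, files can be consumed like any other bash array. A minimal usage sketch, assuming the code above has just run in the same shell:
for f in "${files[@]}"; do
    printf '%s\n' "$f"    # names print oldest-first, special characters intact
done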
All of the cautions and warnings against parsing ls notwithstanding, we have all found ourselves in this situation. If you do find yourself needing sorted directory input, then about the cleanest use of ls to feed your loop is ls -opts | while read -r name; do ...; done. This handles spaces in filenames, etc., without requiring a reset of IFS, because read into a single variable keeps embedded whitespace intact. Example:
ls -1rt | while read -r fname; do    # '1' is the digit one, not a lowercase 'L'
    printf '%s\n' "$fname"; done     # oldest file first; put your own commands here
So do look for cleaner solutions that avoid ls, but if push comes to shove, ls -opts can be used sparingly without the sky falling or dragons plucking your eyes out.
Let me add the disclaimer to keep everyone happy: if you like newlines inside your filenames, then do not use ls to populate a loop. If you do not have newlines inside your filenames, there are no other adverse side-effects.
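To see the failure mode for yourself, here is a quick demo; the file name containing a newline is hypothetical and created only for the test, and GNU ls writing to a pipe prints names literally:
cd "$(mktemp -d)"                      # throwaway directory for the demo
touch plain.txt $'bad\nname.txt'       # one ordinary name, one name containing a newline
ls -1 | while read -r fname; do
    printf '[%s]\n' "$fname"
done
# Prints [bad], [name.txt], [plain.txt] -- three loop entries for two files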
Contra: TLDP Bash Howto Intro:
#!/bin/bash
for i in $( ls ); do
    echo item: $i
done
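For reference, the same loop written with a glob instead of $( ls ) -- the pattern the stat-based answer above relies on -- as a minimal sketch:
#!/bin/bash
for i in *; do
    printf 'item: %s\n' "$i"    # quoting keeps whitespace intact; printf avoids echo's escape pitfalls
done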
It appears that some SO users do not know what 'contra' means -- please look it up before downvoting.