I have some files in Linux, for example 2 of them, and I need to shuffle their lines together into one file.
For example:
$ cat file1
line 1
line 2
line 3
line 4
line 5
line
Here's a one-liner that doesn't rely on shuf or sort -R, neither of which I had on my Mac:
while IFS= read -r line; do echo "$RANDOM $line"; done < my_file | sort -n | cut -f2- -d' '
This iterates over the lines in my_file, prefixes each with a random number, sorts numerically on that prefix, and then cuts the prefix off again, reprinting the lines in a randomized order.
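To answer the original question with this approach, feed both files in and redirect to a new file (the name shuffled.txt is just an example):
cat file1 file2 | while IFS= read -r line; do echo "$RANDOM $line"; done | sort -n | cut -f2- -d' ' > shuffled.txt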
You should use the shuf command =)
cat file1 file2 | shuf
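Since the goal is to get everything into one file, GNU shuf can also write the result directly with its -o option (shuffled.txt is just an example name):
cat file1 file2 | shuf -o shuffled.txt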
Or with Perl:
cat file1 file2 | perl -MList::Util=shuffle -we 'print shuffle <>;'
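If you need the shuffle to be reproducible (for testing, say), List::Util's shuffle uses Perl's built-in rand, so seeding with srand pins the order; the seed 42 here is arbitrary:
cat file1 file2 | perl -MList::Util=shuffle -we 'srand(42); print shuffle <>;'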
Just a note to OS X users who use MacPorts: the shuf command is part of coreutils and is installed under the name gshuf.
$ sudo port install coreutils
$ gshuf example.txt # or cat example.txt | gshuf
You don't need to use pipes here. Sort alone does this with the file(s) as parameters. I would just do
sort -R file1
or if you have multiple files
sort -R file1 file2
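One caveat with sort -R: it sorts by a random hash of the keys, so identical lines always come out grouped together. A quick test makes this visible (the order varies between runs, but the duplicate lines stay adjacent):
$ printf 'a\nb\na\nb\nc\n' | sort -R
b
b
c
a
a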
I would use shuf too.
Another option: GNU sort has
-R, --random-sort
sort by random hash of keys
You could try:
cat file1 file2 | sort -R
Sort (identical lines will be grouped together, since -R sorts by a random hash of the key):
cat file1 file2 | sort -R
Shuf:
cat file1 file2 | shuf
Perl:
cat file1 file2 | perl -MList::Util=shuffle -e 'print shuffle<STDIN>'
Bash:
cat file1 file2 | while IFS= read -r line
do
printf "%06d %s\n" $RANDOM "$line"
done | sort -n | cut -c8-
(%06d pads $RANDOM to six digits, so cut -c8- strips the number plus the space after it.)
Awk:
cat file1 file2 | awk 'BEGIN{srand()}{printf "%06d %s\n", rand()*1000000, $0;}' | sort -n | cut -c8-
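One caveat with the awk version: srand() with no argument seeds from the current time in seconds, so two runs within the same second produce the same order. If that matters, pass in a seed from the shell ($RANDOM here is just one convenient source):
cat file1 file2 | awk -v seed=$RANDOM 'BEGIN{srand(seed)}{printf "%06d %s\n", rand()*1000000, $0;}' | sort -n | cut -c8-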