I have many .sh
scripts in a single folder and would like to run them one after another. A single script can be executed as:
bash wget-some_long_
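For reference, a minimal sequential sketch (assuming the scripts all sit in the current directory and should simply be run with bash, each one finishing before the next starts):

for f in ./*.sh; do
    bash "$f"    # the loop waits for each script to exit before starting the next
done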
I ran into this problem where I couldn't use loops, and run-parts only works with cron.
foo () {
    bash -H "$1"    # execute the script passed in as the first argument
    #echo "$1"
    #cat "$1"
}
cd /dat/dat1/files    # change to the directory containing the scripts
export -f foo         # export the function so the shells spawned by parallel can see it
parallel foo ::: *.sh # run foo on every .sh file, roughly like putting a & between each script
This uses GNU parallel: it executes every script in the directory, with the added benefit that the scripts run concurrently, so the whole batch finishes much faster. It isn't limited to running scripts, either; you can put any command in the function and it will work.
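If the scripts really do need to run one after another, as the question asks, GNU parallel can be limited to a single job at a time; a minimal sketch reusing the same foo function:

parallel -j 1 foo ::: *.sh    # -j 1 runs one job at a time, so the scripts execute sequentially (in the glob's alphabetical order)

Drop the -j 1 to go back to running them concurrently.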