Question
I wrote 10 domains into test.csv and am trying to get the headers of these 10 domains, but it won't read the contents one by one.
I run this script:
for j in test.csv
do
awk -F',' '{ print "$1" }' $j | curl -Is | cat >> b.txt
done
I have about 10 million domains and am trying to fetch the headers with a script. Is there any way to do this?
Answer 1:
curl won't read URLs from stdin unless you tell it to do so. And you don't need a loop for this, even if you have multiple files. What you're looking for is:
awk -F',' '{ print "url=" $1 }' file1 file2 file3 ... | curl -s -I -K- > out
Notice the -K -: it tells curl to read further options (here, the url= lines) from stdin as a config file.
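To make the pipeline concrete, here is a minimal sketch split into two inspectable steps. The file names test.csv and curl.cfg are placeholders, and prefixing an https:// scheme is an assumption (the question's CSV holds bare domains; curl also accepts schemeless URLs, but being explicit avoids guessing):

```shell
# Sample input: first CSV column is the domain (assumed layout).
printf 'example.com,other\nexample.org,other\n' > test.csv

# Emit one "url=" line per row; this is the config format curl's -K expects.
# The https:// prefix is an assumption about the desired scheme.
awk -F',' '{ print "url=https://" $1 }' test.csv > curl.cfg

# curl.cfg now contains:
#   url=https://example.com
#   url=https://example.org

# Fetch only the response headers for every URL in a single curl invocation
# (commented out here since it needs network access):
# curl -s -I -K curl.cfg > headers.txt
```

For 10 million domains this avoids spawning one curl process per URL, which is where the per-line loop approach becomes impractically slow.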
Source: https://stackoverflow.com/questions/59234715/get-header-of-domains-by-bash-script