Script to get the HTTP status code of a list of URLs?

无人及你 2020-11-30 17:17

I have a list of URLs that I need to check, to see whether they still work or not. I would like to write a bash script that does that for me.

I only need the returned HTTP status code.

8 Answers
  •  抹茶落季
    2020-11-30 17:58

This relies on wget, which is widely available, present almost everywhere, even on Alpine Linux.

    wget --server-response --spider --quiet "${url}" 2>&1 | awk 'NR==1{print $2}'
    

The explanation is as follows:

    --quiet

    Turn off Wget's output.

    Source - wget man pages

    --spider

    [ ... ] it will not download the pages, just check that they are there. [ ... ]

    Source - wget man pages

    --server-response

    Print the headers sent by HTTP servers and responses sent by FTP servers.

    Source - wget man pages

    What the man pages don't say about --server-response is that those header lines are printed to standard error (stderr), hence the 2>&1, which redirects stderr to standard output.

    With the headers now on standard output, we can pipe them to awk to extract the HTTP status code. That code is:

    • the second whitespace-separated field on the line: $2
    • on the very first line of the headers: NR==1

    And because we want to print it... {print $2}.
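    To see why that picks out the status code, here is the same awk program run on a literal first header line. (wget indents these lines with two spaces, but awk's default field splitting ignores leading whitespace.)

    ```shell
    # Simulate the first line --server-response prints for a 200 response.
    # awk splits on whitespace: $1 = "HTTP/1.1", $2 = "200", $3 = "OK".
    echo "  HTTP/1.1 200 OK" | awk 'NR==1{print $2}'
    # prints: 200
    ```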

    wget --server-response --spider --quiet "${url}" 2>&1 | awk 'NR==1{print $2}'
    
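    Putting it together for the original question, here is a minimal sketch of a script that checks a whole list. The function name check_urls and the input filename urls.txt are my assumptions; the file is expected to hold one URL per line.

    ```shell
    #!/bin/bash
    # check_urls FILE — print each URL in FILE (one per line, hypothetical
    # filename "urls.txt" in the usage note below) with its HTTP status code.
    check_urls() {
        while IFS= read -r url; do
            [ -z "$url" ] && continue  # skip blank lines
            code=$(wget --server-response --spider --quiet "$url" 2>&1 | awk 'NR==1{print $2}')
            printf '%s\t%s\n' "$url" "${code:-no-response}"
        done < "$1"
    }

    # Usage: check_urls urls.txt
    ```

    Each output line is the URL and its status code separated by a tab; if wget produces no parseable header line at all, the sketch falls back to the placeholder no-response.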
