In Linux, how can I fetch a URL and get its contents into a variable in a shell script?
There are many ways to fetch a page from the command line, but it also depends on whether you want the page source or the page as a browser would render it:
If you need the page source:
with curl:
curl "$url"
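Since the question is about getting the contents into a variable, here is a minimal sketch using command substitution (the variable name contents is my own choice; -s just hides curl's progress meter):

# capture the response body in a shell variable
contents=$(curl -s "$url")
echo "$contents"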
with wget:
wget -O - "$url"
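The same command substitution works with wget; a quick sketch, using -q to silence wget's status messages:

# -qO- writes the page quietly to stdout, so it can be captured
contents=$(wget -qO- "$url")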
But if you want to get what you actually see in a browser, lynx can be useful:
lynx -dump "$url"
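And likewise with lynx if you want the rendered text in a variable (a sketch; lynx is often not installed by default):

# capture the rendered, plain-text version of the page
contents=$(lynx -dump "$url")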
You can find many solutions to this little problem; reading the man pages for these commands is a good start. And don't forget to replace $url with your actual URL :)
Good luck :)