How to download links in the html of a url? [closed]

Submitted 2020-01-03 05:38:11

Question


For example, when I open https://stackoverflow.com/ in a browser, the browser downloads not only the main page but also the images, JS, and CSS it references.

But when I run curl https://stackoverflow.com/, only the main page HTML is downloaded. Is there any option for curl or wget that downloads the images/JS/CSS as well?

Or is there any other tool that can do this?


Answer 1:


wget -r downloads recursively, following the links on the page and saving everything it finds:

wget -r www.your-site.com
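
If the goal is just to fetch a single page together with the images, CSS, and JS it needs (rather than crawl the whole site), wget's page-requisites options are usually closer to what a browser does. A minimal sketch, using the URLs from the question and answer as placeholders:

wget -p -k https://stackoverflow.com/
    # -p / --page-requisites: also fetch the images, CSS, and JS the page needs
    # -k / --convert-links: rewrite links so the saved copy works offline

wget -r -l 1 -p -k -E https://www.your-site.com/
    # -r -l 1: recurse, but only one level deep
    # -E / --adjust-extension: save HTML/CSS files with matching extensions

curl itself does not parse HTML, so it has no single flag for pulling in a page's sub-resources; wget (or a dedicated crawler) is the usual tool for this.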


Source: https://stackoverflow.com/questions/48000468/how-to-download-links-in-the-html-of-a-url
