Can I use WGET to generate a sitemap of a website given its URL?

长发绾君心 2021-02-03 10:42

I need a script that can spider a website and return the list of all crawled pages in plain text or a similar format, which I will submit to search engines as a sitemap. Can I use wget to do this?

2 Answers
  • 2021-02-03 11:14
    # Crawl the site without downloading anything and log every URL wget discovers
    wget --spider --recursive --no-verbose --output-file=wgetlog.txt http://somewebsite.com
    # Extract the URLs from the log and escape ampersands so they are valid in XML
    sed -n "s@.\+ URL:\([^ ]\+\) .\+@\1@p" wgetlog.txt | sed "s@&@\&amp;@g" > sedlog.txt
    

    This creates a file called sedlog.txt containing all the links found on the specified website. You can then use PHP or a shell script to convert that plain-text list into an XML sitemap, for example along the lines of the sketch below. Tweak the parameters of the wget command (accept/reject/include/exclude) to get only the links you need.
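    A minimal shell sketch of that conversion step, assuming sedlog.txt is in the current directory with one URL per line and ampersands already escaped by the sed command above:

    #!/bin/sh
    # Wrap each crawled URL from sedlog.txt in sitemap.org <url>/<loc> tags.
    {
      echo '<?xml version="1.0" encoding="UTF-8"?>'
      echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
      while read -r url; do
        echo "  <url><loc>$url</loc></url>"
      done < sedlog.txt
      echo '</urlset>'
    } > sitemap.xml

    The resulting sitemap.xml is what you would submit to the search engines.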

  • 2021-02-03 11:19

    You can use this Perl script to do the trick: http://code.google.com/p/perlsitemapgenerator/
