Robots.txt for multiple domains

迷失自我 2020-12-15 19:02

We have a different domain for each language:

  1. www.abc.com
  2. www.abc.se
  3. www.abc.de

And we have a different sitemap.xml for each domain.
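
For example (assuming each sitemap is served from its domain's root; the exact paths are not stated in the question):

  1. www.abc.com/sitemap.xml
  2. www.abc.se/sitemap.xml
  3. www.abc.de/sitemap.xml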

3 Answers
  •  野趣味
     2020-12-15 19:41

    A robots.txt file can only declare sitemaps for its own domain, so that is the only sitemap a search engine will honor when it crawls that domain's robots.txt. If all three domains map to the same website and share one robots.txt, the search engines will still effectively find each sitemap, because the crawler fetches robots.txt separately on each domain and picks up the matching sitemap entry each time.
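
    As a sketch, a shared robots.txt served identically on all three hosts could list each domain's sitemap. The URLs below are assumptions based on the domains in the question; adjust the paths to wherever the sitemaps actually live:

        # Shared robots.txt served on www.abc.com, www.abc.se and www.abc.de
        User-agent: *
        Disallow:

        # One Sitemap line per domain (absolute URLs are required)
        Sitemap: https://www.abc.com/sitemap.xml
        Sitemap: https://www.abc.se/sitemap.xml
        Sitemap: https://www.abc.de/sitemap.xml

    When a crawler requests www.abc.se/robots.txt it acts on the abc.se Sitemap line; the other two lines get picked up when it crawls the other domains' copies of the same file.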
