Robots.txt for multiple domains

迷失自我 2020-12-15 19:02

We have different domains for each language

  1. www.abc.com
  2. www.abc.se
  3. www.abc.de

And we have a different sitemap.xml for each domain. How should we set up robots.txt so that each domain points to its own sitemap?

3 Answers
  • 2020-12-15 19:36

    Based on Hans2103's answer, I wrote this one, which should be safe to include in just about every web project:

    # URL rewrite solution for robots.txt for multiple domains on a single docroot
    # Skip the rewrite if the request already resolves to an existing directory or file
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    # ...and only rewrite when a domain-specific robots file exists
    # (the -f check needs a full filesystem path, hence %{DOCUMENT_ROOT})
    RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
    RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
    

    These conditions serve the normal robots.txt when a physical file is present, and otherwise look in a robots/ directory for the domain-specific file robots/<domain.tld>.txt.

    N.B.: The above rewrite has not yet been tested. Feel free to correct me if you find any flaws; I will update this post for future reference upon any helpful corrective comments.
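
    A quick way to sanity-check a rewrite like this, assuming the vhost answers on localhost and a file such as robots/www.abc.se.txt exists (both assumptions for illustration, not part of the answer above), is to request /robots.txt with an explicit Host header:

    # Hypothetical check: the response body should come from robots/www.abc.se.txt,
    # not from a generic robots.txt
    curl -s -H "Host: www.abc.se" http://localhost/robots.txt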

  • 2020-12-15 19:37

    I'm using the following solution in .htaccess, placed after all domain redirects and the www-to-non-www redirection.

    # Rewrite URL for robots.txt
    RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
    

    Create a new directory in your root called robots, and for every domain create a text file containing that domain's specific robots directives (a sketch of one such file follows the list below).

    • /robots/abc.com.txt
    • /robots/abc.se.txt
    • /robots/abc.de.txt
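
    As a rough sketch, one of those per-domain files might look like this; the Disallow rule and sitemap URL are placeholders, not taken from the question:

    # /robots/abc.se.txt: placeholder directives for the Swedish domain
    User-agent: *
    Disallow: /admin/
    Sitemap: https://www.abc.se/sitemap.xml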
  • 2020-12-15 19:41

    A robots.txt file can only inform search engines about sitemaps for its own domain, so that sitemap is the only one a crawler will honor when it reads that domain's robots.txt. If all three domains map to the same website and share a single robots.txt, the search engines will still effectively find each sitemap.
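
    For illustration, if the three domains do share one physical robots.txt, it could list all three sitemaps (the URLs below are assumptions based on the domains in the question); per the reasoning above, a crawler reading the file on a given host honors the sitemap for that host:

    # Shared robots.txt served identically on all three domains
    User-agent: *
    Disallow:
    Sitemap: https://www.abc.com/sitemap.xml
    Sitemap: https://www.abc.se/sitemap.xml
    Sitemap: https://www.abc.de/sitemap.xml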
