Robots.txt for multiple domains

迷失自我 · 2020-12-15 19:02

We have a different domain for each language:

  1. www.abc.com
  2. www.abc.se
  3. www.abc.de

And we have a different sitemap.xml for each domain, referenced from that domain's robots.txt (example below).
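
For illustration, a robots.txt served on one of these domains might reference its own sitemap via the Sitemap directive. The sitemap URL below is an assumption, since the question does not say where each sitemap lives:

    # hypothetical robots.txt served on www.abc.se
    User-agent: *
    Allow: /

    Sitemap: https://www.abc.se/sitemap.xml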

3 Answers
  •  旧时难觅i · 2020-12-15 19:36

    Based on Hans2103's answer, I wrote this one that should be safe to be included in just about every web project:

    # URL rewrite solution for robots.txt for multiple domains on a single docroot
    RewriteEngine On
    # not an existing directory
    RewriteCond %{REQUEST_FILENAME} !-d
    # not an existing file
    RewriteCond %{REQUEST_FILENAME} !-f
    # and the host-specific robots file exists
    RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
    RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
    

    These rewrite conditions serve the normal robots.txt if one physically exists in the docroot, and otherwise look in a robots/ directory for a file named after the requested host, e.g. robots/www.abc.se.txt. (Note that Apache does not allow comments on the same line as a directive, so each comment must sit on its own line, as above.)
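
    A minimal sketch of the layout this assumes, with all three domains from the question pointing at one document root and file names following %{HTTP_HOST}:

        docroot/
        ├── .htaccess          <- contains the rewrite rules above
        └── robots/
            ├── www.abc.com.txt
            ├── www.abc.se.txt
            └── www.abc.de.txt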

    N.B.: The above rewrite has not been tested yet. Feel free to correct me if you find any flaws; I will update this post for future reference based on any helpful corrections.
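
    Since the rule is untested, one quick way to verify it is to request robots.txt while overriding the Host header; the 127.0.0.1 address is an assumption about where your server listens:

        # should return the contents of robots/www.abc.se.txt
        curl -H "Host: www.abc.se" http://127.0.0.1/robots.txt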
