Robots.txt for multiple domains

迷失自我 2020-12-15 19:02

We have a different domain for each language:

  1. www.abc.com
  2. www.abc.se
  3. www.abc.de

Each domain also has its own sitemap.xml. How can we serve a separate robots.txt for each domain?

3 Answers
  •  失恋的感觉
    2020-12-15 19:37

    I'm using the following solution in .htaccess, placed after all domain redirects and the www-to-non-www redirection.

    # Rewrite URL for robots.txt: serve a per-domain file named after the
    # requesting host, e.g. abc.se/robots.txt maps to /robots/abc.se.txt
    RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
    
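    This rule assumes the host has already been normalized by the earlier redirect rules. If the www-to-non-www redirect is not in place, a minimal sketch (an assumption, not part of the original answer) can strip a leading "www." inside the rule itself, so only one file per domain is needed:

    # Sketch (assumption): capture the host without a leading "www." so that
    # www.abc.se/robots.txt and abc.se/robots.txt both serve /robots/abc.se.txt
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
    RewriteRule ^robots\.txt$ robots/%1.txt [L]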

    Create a new directory called robots in your web root, and add a text file with the robots directives for each domain:

    • /robots/abc.com.txt
    • /robots/abc.se.txt
    • /robots/abc.de.txt
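
    Since each domain has its own sitemap.xml, each per-domain file can also point crawlers to the right sitemap via a Sitemap directive. A hypothetical example for /robots/abc.se.txt (the directives and the sitemap URL are assumptions, not from the original question):

    # /robots/abc.se.txt (hypothetical contents)
    User-agent: *
    Allow: /
    # Each domain's file references its own sitemap (assumed URL)
    Sitemap: https://www.abc.se/sitemap.xml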
