Ignore URLs in robots.txt with specific parameters?

误落风尘 2020-12-02 16:41

I would like Google to ignore URLs like this:

http://www.mydomain.com/new-printers?dir=asc&order=price&p=3

All URLs that have the parameters dir, order and p (as in the example above) should be ignored.

3 Answers
  •  悲&欢浪女
    2020-12-02 17:07

    Here's a solution if you want to disallow all query strings:

    Disallow: /*?*
    

    or, if you want to be more precise about the query string:

    Disallow: /*?dir=*&order=*&p=*
    

    You can also add an Allow rule to the robots.txt to specify which URL is allowed:

    Allow: /new-printer$
    

    The $ anchors the match at the end of the URL, so only /new-printer itself (with no query string) will be allowed.
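
    Putting these pieces together, a minimal sketch of a complete robots.txt for this case might look like the following. It assumes the parameter names dir, order and p from the example URL, uses the path /new-printers from the question, and relies on Google-style wildcard matching (* and $), which not all crawlers support:

    User-agent: *
    # Block URLs whose query string matches the dir/order/p pattern
    Disallow: /*?dir=*&order=*&p=*
    # Explicitly allow the bare page; $ anchors the match at the end of the URL
    Allow: /new-printers$

    Note that a pattern like /*?dir=*&order=*&p=* only matches when the parameters appear in that order in the query string; URLs with the same parameters in a different order would not be blocked by it.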

    More info:

    http://code.google.com/web/controlcrawlindex/docs/robots_txt.html

    http://sanzon.wordpress.com/2008/04/29/advanced-usage-of-robotstxt-w-querystrings/
