Ignore URLs in robots.txt with specific parameters?
I would like Google to ignore URLs like this:

http://www.mydomain.com/new-printers?dir=asc&order=price&p=3

All URLs that have the parameters dir, order and p should be ignored, but I don't have any experience with robots.txt. Any idea?

Here's a solution if you want to disallow all query strings:

Disallow: /*?*

or if you want to be more precise with your query string:

Disallow: /*?dir=*&order=*&p=*

Note that the precise pattern only matches URLs where those parameters appear in exactly that order. You can also add an Allow rule to robots.txt to specify which URL to allow:

Allow: /new-printer$

The $ ensures that only /new-printer is allowed, not /new-printers or /new-printer?dir=asc.

More info: http://code.google.com/web/controlcrawlindex/docs/robots
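To see how these patterns behave, here is a small illustrative sketch (not an official Google tool) that converts Google-style robots.txt patterns, where * matches any characters and a trailing $ anchors the end of the URL, into regular expressions and tests them against sample paths. Note that Python's built-in urllib.robotparser does not support these wildcard extensions, which is why this sketch rolls its own matcher.

```python
import re

def pattern_to_regex(pattern):
    """Convert a Google-style robots.txt path pattern to a compiled regex.

    '*' matches any sequence of characters; a trailing '$' anchors the
    end of the URL; otherwise the pattern is a prefix match.
    """
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"  # restore the end-of-URL anchor
    return re.compile(regex)

def matches(pattern, path):
    # Rules match from the start of the path (prefix semantics).
    return bool(pattern_to_regex(pattern).match(path))

# Disallow: /*?*  blocks any URL containing a query string:
print(matches("/*?*", "/new-printers?dir=asc&order=price&p=3"))  # True
print(matches("/*?*", "/new-printers"))                          # False

# Disallow: /*?dir=*&order=*&p=*  only matches that parameter order:
print(matches("/*?dir=*&order=*&p=*",
              "/new-printers?dir=asc&order=price&p=3"))          # True
print(matches("/*?dir=*&order=*&p=*",
              "/new-printers?order=price&dir=asc&p=3"))          # False

# Allow: /new-printer$  matches only the exact path, thanks to '$':
print(matches("/new-printer$", "/new-printer"))                  # True
print(matches("/new-printer$", "/new-printers"))                 # False
```

This makes the trade-off visible: the broad /*?* rule blocks every parameterized URL, while the precise rule is order-sensitive, so crawlers may still fetch URLs with the same parameters in a different order.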