robots.txt: how to disallow subfolders of a dynamic folder
I have URLs like these:

    /products/:product_id/deals/new
    /products/:product_id/deals/index

I'd like to disallow the "deals" folder in my robots.txt file.

[Edit] I'd like to disallow this folder for the Google, Yahoo, and Bing bots. Does anyone know whether these bots support the wildcard character, and therefore would honor the following rule?

    Disallow: /products/*/deals

Also, do you know of any really good tutorial on robots.txt rules? I haven't managed to find a genuinely good one, and I could use one.

One last question: is robots.txt the best way to handle this, or should I use the "noindex" meta tag instead?

Thanks
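For what it's worth, here is a minimal sketch of what the per-crawler rules might look like, assuming the commonly documented user-agent tokens (Googlebot for Google, Slurp for Yahoo, bingbot for Bing). Note that the * wildcard is an extension honored by these major crawlers, not part of the original robots.txt standard, so smaller crawlers may ignore it:

    User-agent: Googlebot
    Disallow: /products/*/deals

    User-agent: Slurp
    Disallow: /products/*/deals

    User-agent: bingbot
    Disallow: /products/*/deals

If the meta route turns out to be the better fit, the standard tag would be <meta name="robots" content="noindex"> placed in the head of each deals page; unlike a robots.txt Disallow, it requires the crawler to actually fetch the page before it can see the directive.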