Can anybody please explain the correct robots.txt directive for the following scenario?
I would like to allow access to:
/directory/subdirec
Be aware that there is no official standard, and any web crawler may happily ignore your robots.txt.
According to a Google Groups post, the following works at least with Googlebot:
User-agent: Googlebot
Disallow: /directory/
Allow: /directory/subdirectory/
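You can sanity-check rules like these locally with Python's standard-library `urllib.robotparser`. One caveat: Googlebot resolves conflicts by longest matching rule, while CPython's parser appears to apply the first matching rule in file order, so for this stdlib check the `Allow` line is placed before the `Disallow` line. The paths below (`page.html`, `private.html`, `other.html`) are made-up examples:

```python
from urllib import robotparser

# Allow listed first because urllib.robotparser is first-match,
# unlike Googlebot's longest-match resolution.
rules = """\
User-agent: Googlebot
Allow: /directory/subdirectory/
Disallow: /directory/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Subdirectory is reachable despite the parent being disallowed.
print(rp.can_fetch("Googlebot", "/directory/subdirectory/page.html"))  # True
# Everything else under /directory/ stays blocked.
print(rp.can_fetch("Googlebot", "/directory/private.html"))            # False
# Paths outside /directory/ are unaffected.
print(rp.can_fetch("Googlebot", "/other.html"))                        # True
```

If you want to test against Googlebot's actual semantics rather than this approximation, Google's open-source robots.txt parser is the authoritative implementation.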