I'm trying to block all bots/crawlers/spiders for a specific directory. How can I do that with .htaccess? I searched a little bit and found a solution by blocking…
Why use .htaccess or mod_rewrite for a job that robots.txt is specifically meant for? Here is the robots.txt snippet you will need to block a specific set of directories:
User-agent: *
Disallow: /subdir1/
Disallow: /subdir2/
Disallow: /subdir3/
This tells all well-behaved search bots to stay out of /subdir1/, /subdir2/, and /subdir3/. Keep in mind that robots.txt is advisory: crawlers that choose to ignore it will not be stopped.
For more explanation, see http://www.robotstxt.org/orig.html
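If you also want to turn away bots that ignore robots.txt, you can enforce it at the server level with .htaccess. Here is a minimal sketch, assuming mod_rewrite is enabled and the file is placed inside the directory you want to protect; the user-agent pattern (bot|crawler|spider) is just an illustrative example, not an exhaustive list:

# Return 403 Forbidden when the User-Agent matches common bot keywords
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider) [NC]
RewriteRule ^ - [F,L]

The [NC] flag makes the match case-insensitive, and [F] sends a 403 Forbidden response instead of serving the page.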