What options are there to detect web-crawlers that do not want to be detected?
(I know that listing detection techniques will allow the smart stealth-crawler author to adapt, but I am interested in the options anyway.)
An easy solution is to create a link and make it invisible to human visitors (for example, hidden with CSS). A real user will never click it, so any request for that URL deserves a closer look.
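As a sketch, such a honeypot link might look like this (the URL and class name are made-up examples):

```html
<!-- Invisible to humans; only source-readers and crawlers will find it. -->
<a href="/trap/do-not-follow" class="honeypot" style="display: none">stats</a>
```

Hiding it via an external stylesheet instead of an inline style makes it slightly harder for a crawler to recognize the trap.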
Of course, you should expect that some people who read the source code will follow that link just to see where it leads. But you could present those users with a CAPTCHA...
Valid crawlers would, of course, also follow the link. But rather than adding a rel="nofollow" to keep them out, look for the signs of a valid crawler (such as the user agent string) and let those through.
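Server-side, the check could be sketched like this. The honeypot path and the crawler list are assumptions for illustration, not a definitive implementation; note that user agents can be spoofed, so a production version should also verify known crawlers via reverse DNS.

```python
# Hypothetical honeypot URL matching the hidden link in the page.
HONEYPOT_PATH = "/trap/do-not-follow"

# User-agent substrings of well-known, legitimate crawlers.
KNOWN_CRAWLERS = ("Googlebot", "Bingbot", "DuckDuckBot")

def classify_visitor(path: str, user_agent: str) -> str:
    """Classify a request based on whether it followed the invisible link."""
    if path != HONEYPOT_PATH:
        return "normal"          # never touched the trap
    if any(name in user_agent for name in KNOWN_CRAWLERS):
        # Announces itself as a known crawler; trust only after
        # a reverse-DNS check, since the header can be spoofed.
        return "valid-crawler"
    # Followed the invisible link without identifying itself:
    # likely a stealth crawler, or a curious source-reader.
    return "suspect"
```

Visitors classified as "suspect" are the ones you would send to the CAPTCHA mentioned above.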