I need to detect scraping of info on my website. I tried detection based on behavior patterns, and it seems promising, although it is relatively compute-heavy.
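For reference, here is a minimal sketch of the kind of behavioral check I mean, assuming a simple per-IP sliding-window request rate (the class name, threshold, and window size are all made-up for illustration):

```python
import time
from collections import defaultdict, deque


class RateTracker:
    """Flag clients whose request rate over a sliding window exceeds a
    threshold -- one simple behavioral signal among many possible ones."""

    def __init__(self, max_requests=30, window=60.0):
        self.max_requests = max_requests
        self.window = window          # seconds
        self.hits = defaultdict(deque)

    def is_suspicious(self, client_ip, now=None):
        """Record one request and report whether this IP is over the limit."""
        now = time.monotonic() if now is None else now
        q = self.hits[client_ip]
        q.append(now)
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_requests
```

Even this cheap version needs per-client state, which is part of why the approach gets expensive at scale.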
To be honest, your approach is worthless because it's trivially bypassed. An attacker doesn't even have to write a line of code to get around it: proxy servers are free, and you can boot up a new machine with a fresh IP address on Amazon EC2 for 2 cents an hour.
A better approach is Roboo, which uses cookie techniques to foil robots. The vast majority of robots can't run JavaScript or Flash, and this can be used to your advantage.
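A minimal sketch of the cookie-challenge idea, assuming a hand-rolled token rather than Roboo's actual implementation (the secret, cookie name, and helper functions here are all hypothetical):

```python
import hashlib
import time

SECRET = "replace-with-a-server-side-secret"  # hypothetical secret


def make_token(client_ip, now=None, window=300):
    """Token the JS challenge must echo back in a cookie, bound to the
    client IP and a coarse time window so it can't be replayed forever."""
    bucket = int(time.time() if now is None else now) // window
    data = "{}:{}:{}".format(SECRET, client_ip, bucket)
    return hashlib.sha256(data.encode()).hexdigest()


def challenge_page(client_ip):
    """HTML that sets the cookie via JavaScript and reloads.
    A client that cannot execute JavaScript never acquires the cookie."""
    token = make_token(client_ip)
    return (
        "<html><body><script>"
        'document.cookie = "challenge={}; path=/";'.format(token) +
        "location.reload();"
        "</script></body></html>"
    )


def passes_challenge(client_ip, cookies, now=None):
    """Server-side check on every request: serve content only if the
    cookie matches, otherwise serve challenge_page() instead."""
    return cookies.get("challenge") == make_token(client_ip, now)
```

Any headless browser (or five minutes with curl and a copy of the token logic) defeats this, which is exactly the point below.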
However, all of this is "(in)security through obscurity", and the ONLY REASON it might work is that your data isn't worth a programmer spending 5 minutes on it (Roboo included).