I'm looking into building a content site with possibly thousands of different entries, accessible by index and by search.
What are the measures I can take to prevent it from being scraped?
The only way to stop a site from being machine-ripped is to make the user prove that they are human.
You could make users perform a task that is easy for humans but hard for machines, e.g. a CAPTCHA. When a user first arrives at your site, present a CAPTCHA and only let them proceed once they have completed it. If the user starts moving from page to page too quickly, re-verify.
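A minimal sketch of that gating logic, assuming a Flask app (the question names no stack, and the 2-second threshold is an illustrative guess); the CAPTCHA check itself is a placeholder you would wire up to a real provider such as reCAPTCHA:

```python
import time

from flask import Flask, redirect, request, session, url_for

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret-key"

MIN_SECONDS_BETWEEN_PAGES = 2.0  # anything faster looks automated (assumed threshold)

@app.before_request
def require_human():
    # Never gate the challenge page (or static assets), or we redirect forever.
    if request.endpoint in ("captcha", "static"):
        return None
    # First visit: no proof of humanity yet, so challenge.
    if not session.get("verified"):
        return redirect(url_for("captcha"))
    # Re-verify anyone paging through the site faster than a human plausibly could.
    now = time.time()
    last = session.get("last_seen", 0.0)
    session["last_seen"] = now
    if now - last < MIN_SECONDS_BETWEEN_PAGES:
        session["verified"] = False
        return redirect(url_for("captcha"))
    return None

@app.route("/captcha", methods=["GET", "POST"])
def captcha():
    if request.method == "POST":
        # Placeholder: validate the answer against your CAPTCHA provider here.
        session["verified"] = True
        session["last_seen"] = 0.0  # don't punish the very next page view
        return redirect(url_for("index"))
    return "<form method='post'>Prove you are human: <button>Submit</button></form>"

@app.route("/")
def index():
    return "Content page"
```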
This is not 100% effective, and attackers are constantly working to break CAPTCHAs.
Alternatively, you could slow your responses down. You don't need to make users crawl; pick a speed that is reasonable for a human but very slow for a machine. This only makes scraping your site take longer, not impossible.
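A rough sketch of per-client throttling, again assuming Flask and an illustrative delay value:

```python
import threading
import time
from collections import defaultdict

from flask import Flask, request

app = Flask(__name__)

MIN_INTERVAL = 1.5  # seconds a human plausibly spends between pages (assumed)
_last_seen = defaultdict(float)
_lock = threading.Lock()

@app.before_request
def slow_down():
    # Key by remote address; behind a proxy you would read X-Forwarded-For instead.
    client = request.remote_addr or "unknown"
    now = time.time()
    with _lock:
        elapsed = now - _last_seen[client]
        _last_seen[client] = now
    if elapsed < MIN_INTERVAL:
        # Hold the response just long enough to cap the crawl rate:
        # a human barely notices, a scraper's throughput collapses.
        time.sleep(MIN_INTERVAL - elapsed)

@app.route("/")
def index():
    return "Content page"
```

Note that a blocking sleep ties up a server worker for its duration, so at any real scale you would enforce this at the reverse proxy or load balancer rather than in the application.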
OK. Out of ideas.