Search engines ARE screen scrapers by definition. So most things you do to make it harder to screen scrape will also make it harder to index your content.
Well-behaved robots will honour your robots.txt file.
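For example, a minimal robots.txt that lets well-behaved crawlers index everything while disallowing one misbehaving bot (the user-agent name here is just a placeholder) looks like this:

```
# Allow everything by default
User-agent: *
Disallow:

# Block a specific misbehaving crawler (hypothetical name)
User-agent: BadScraperBot
Disallow: /
```

Keep in mind this is purely advisory: scrapers that ignore robots.txt won't be stopped by it.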
You could also block the IPs of known offenders, or serve obfuscated HTML whenever the request isn't coming from a known-good robot. It's a losing battle, though. For known offenders, I recommend the litigation route.
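As a rough sketch of that idea (the IP lists and decoy markup here are made up for illustration): refuse known offenders outright, serve clean HTML to trusted crawler addresses, and pepper everyone else's response with decoy tags that naive scrapers will pick up as noise.

```python
# Hypothetical IP lists -- in practice you'd verify crawlers via
# reverse DNS rather than hard-coding addresses.
KNOWN_OFFENDERS = {"203.0.113.7"}   # documentation-range example IP
TRUSTED_BOTS = {"192.0.2.10"}       # e.g. a verified search-engine crawler

def render(content, client_ip):
    """Return the response body for this client, or None to refuse."""
    if client_ip in KNOWN_OFFENDERS:
        return None                  # block outright
    if client_ip in TRUSTED_BOTS:
        return content               # clean markup for good robots
    # Everyone else gets decoy spans injected between words.
    return content.replace(" ", ' <span class="decoy"></span>')
```

Note that browsers will still render the obfuscated version fine (empty spans are invisible), which is the point: only automated text extraction is inconvenienced, and only until the scraper adapts.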
You could also hide identifying data in the content to make it easier to track down offenders. Encyclopaedias have been known to add fictitious entries to help detect and prosecute copyright infringers.
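The same trick works per-subscriber: embed an invisible, recipient-specific marker so that when content leaks, you can tell whose copy it came from. A minimal sketch (the marker scheme, zero-width characters derived from a hash, is just one possible approach):

```python
import hashlib

def fingerprint(text, subscriber_id):
    """Append an invisible, subscriber-specific watermark to text."""
    digest = hashlib.sha256(subscriber_id.encode()).hexdigest()[:8]
    bits = bin(int(digest, 16))[2:].zfill(32)
    # Zero-width space for 0-bits, zero-width non-joiner for 1-bits:
    # invisible when rendered, but survives copy-and-paste.
    marker = "".join("\u200b" if b == "0" else "\u200c" for b in bits)
    return text + marker

def identify(leaked_text, subscriber_ids):
    """Return which subscriber's watermark the leaked text carries, if any."""
    for sid in subscriber_ids:
        if leaked_text.endswith(fingerprint("", sid)):
            return sid
    return None
```

Like the fictitious-entry trick, this doesn't prevent copying; it gives you evidence for the litigation route.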