You can reduce the chances of your site being listed by using a robots.txt file. Note that this depends on the "goodwill" of the crawler, though: well-behaved crawlers honor it, while some spambots will explicitly look at the locations you disallow.
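As a minimal sketch of how a compliant crawler treats those rules (the paths and the bot name are made up; Python's `urllib.robotparser` is just one way to model the check):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content. Polite crawlers fetch /robots.txt
# and honor it; spambots may read it only to find the "hidden" paths.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /drafts/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler checks every URL before fetching it.
print(rp.can_fetch("ExampleBot", "https://example.com/private/secret.html"))  # False
print(rp.can_fetch("ExampleBot", "https://example.com/index.html"))           # True
```

Nothing enforces that second check on the crawler's side, which is exactly why robots.txt is a request, not protection.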
The only safe and reliable way of not having a site listed, sadly, is not putting it on the internet.
Simply not linking to your site will not work. Crawlers get their information from many sources, including browser referrer headers and domain registrars. So, in order to be "invisible", you would have to never visit your own site (referrers and browser toolbars can leak its URLs) and never register a domain for it, accessing it only via IP address.
And then, even if you only serve the site on a bare IP address, you still have all the spambots probing random addresses. It will take a while, but they will find you.
Password-protecting your site should work, since it effectively makes the content inaccessible to crawlers. That said (and it is beyond my comprehension how this happens), there are, for example, literally thousands of ACM papers listed in Google that you cannot view without an account and logging in. Yet they are indexed.
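To illustrate the idea (a rough sketch only, with made-up credentials and port, not production-ready), password protection just means the server refuses to hand over content without valid credentials, so a crawler without them gets a 401 challenge instead of a page to index:

```python
import base64
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical credentials for illustration only -- use a real
# authentication setup (and HTTPS) for anything serious.
USERNAME, PASSWORD = "alice", "s3cret"
EXPECTED = "Basic " + base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()

class AuthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("Authorization") != EXPECTED:
            # Missing or wrong credentials: a crawler hitting this URL
            # receives a 401 challenge and nothing worth indexing.
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="Private"')
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"Members-only content.")

if __name__ == "__main__":
    HTTPServer(("", 8000), AuthHandler).serve_forever()
```

Unless the search engine is given credentials (or the publisher feeds it content through some other arrangement), it has nothing to list beyond the login prompt.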