Yes, write a robots.txt file; it will tell compliant crawlers not to index the desired pages or the whole site!
Create a robots.txt file with the following content and place it at the root of your server, e.g. yourdomain.com/robots.txt:

User-agent: *
Disallow: /

Or use this meta tag on each page instead: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
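To see what those two directives actually do, here is a small sketch using Python's standard-library robots.txt parser. The domain and paths are just placeholders from the example above, not anything specific to your site:

```python
from urllib.robotparser import RobotFileParser

# The blanket rules from the post above: every user agent, whole site disallowed
rules = """User-agent: *
Disallow: /"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Under these rules, no compliant crawler may fetch any URL on the site
print(parser.can_fetch("Googlebot", "https://yourdomain.com/"))           # False
print(parser.can_fetch("Googlebot", "https://yourdomain.com/page.html"))  # False
```

Note this only tells you what a well-behaved crawler will respect; it does not enforce anything on the server side.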
Well, if you really don't want anything indexed, password-protect your website, because people can still manually scrape your content and use it. There are also many search engines, and we don't know whether all of them honor robots.txt; Google might also find your content somewhere else and still index it.
If I set up robots.txt, will it block my domain as well? I mean, indexing the site is one thing and the domain alone is another...
May I know whether you intend to block your whole site? You can also configure robots.txt to block only certain directories.
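For example, a robots.txt that blocks only some directories rather than the whole site could look like this (the directory names here are hypothetical; substitute your own):

```
# Block all crawlers from these directories only
User-agent: *
Disallow: /admin/
Disallow: /private/
```

Everything not matched by a Disallow line stays crawlable, so the rest of the domain can still be indexed.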
Make your robots.txt with the text above and save it to the root of your domain... cheers. Regards, SwapsRulez
All webmasters love to see their sites/pages indexed by search engines. May we know why you are trying to block search engines from getting into your site?