You need to use robots.txt as seojig stated, but I would also insert the robots meta tag into each web page you want removed. Below is an example of both.

Insert this into the <head></head> section of each page to be de-indexed:

<meta name="robots" content="noindex">

Then add the following to your robots.txt file. The first example blocks a specific web page; the second blocks an entire directory.

User-agent: *
Disallow: /file.html

User-agent: *
Disallow: /dir/
Bear in mind, though, that with ssandecki's approach you'd have to name the specific bot you want to block (such as Googlebot for Google); with User-agent: *, EVERY search engine spider will be blocked from crawling those pages, not just Google's.
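For example, if you only wanted to keep Google out, the robots.txt rule would look something like this (Googlebot is Google's crawler token; the /dir/ path is just the placeholder from the examples above):

User-agent: Googlebot
Disallow: /dir/

Every other spider ignores a Googlebot-specific block and will keep crawling that directory as normal.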