www.robotstxt.org has the information you need. Basically, though, you just type the following into a text editor and save it as robots.txt before uploading it to the root directory of your site:

User-agent: *
Disallow: /cgi-bin
Sitemap: http://www.yourdomain.com/sitemap.xml

Replace yourdomain.com with your own domain, and make sure you actually have a sitemap.xml file there. As for other pages and directories you want to block, just add Disallow lines as needed, like:

Disallow: /images/
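If you want to sanity-check your rules before uploading, here's a quick sketch using Python's standard urllib.robotparser (the domain and paths are just placeholders matching the example above):

```python
from urllib.robotparser import RobotFileParser

# The rules from the example above -- substitute your own domain/paths.
rules = """\
User-agent: *
Disallow: /cgi-bin
Disallow: /images/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())  # parse locally, no network request needed

# A well-behaved crawler checks can_fetch() before requesting a URL.
print(rp.can_fetch("*", "http://www.yourdomain.com/cgi-bin/script.pl"))  # blocked
print(rp.can_fetch("*", "http://www.yourdomain.com/index.html"))         # allowed
```

Disallow rules are prefix matches, so Disallow: /cgi-bin blocks everything under that path.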