Hi guys. I want to try to block a load of bad bots in two ways: .htaccess and robots.txt. I have cracked the .htaccess file; now I need to complete the robots.txt file.

I always thought that in a robots.txt file you needed to tell the bots they could access everything (or whatever you wanted them to access) and then individually block the bad bots. So:

User-agent: *
Disallow:

says everyone can spider the site, and:

User-agent: EmailWolf
Disallow: /

will block the agent 'EmailWolf'?

When I look at the robots file at seobook.com/robots.txt, there seems to be NO instruction saying everyone can spider the site. Do I need:

User-agent: *
Disallow:

or should I just copy seobook.com/robots.txt and block everything that is listed there?
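For context, this is roughly what my draft robots.txt would look like under my current understanding (an empty Disallow allows everything; Disallow: / blocks the whole site for that agent). The extra bot name here ('SomeOtherBadBot') is just a placeholder, not a real entry from my list:

```
# Default rule: an empty Disallow value means "nothing is disallowed",
# so any bot not matched by a more specific User-agent block may crawl everything.
User-agent: *
Disallow:

# Block 'EmailWolf' from the entire site.
User-agent: EmailWolf
Disallow: /

# Placeholder for further bad bots, one block per agent.
User-agent: SomeOtherBadBot
Disallow: /
```

Note that robots.txt is purely advisory; badly behaved bots can ignore it, which is why I'm pairing it with the .htaccess rules.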