I would like to create a robots.txt file for my WordPress blog. The goal is to block all bots from accessing a few specific pages/directories/files while allowing them to index everything else. I'm afraid of getting it wrong, so I'd like your advice first. Is this robots.txt correct?

User-Agent: *
Allow: /
Disallow: /page1.html
Disallow: /page2.html
Disallow: /pop.php

- Should I add "Allow: /", or just use Disallow for the specific pages?
- pop.php takes dynamic query strings such as ?m=xx (pop.php?m=xx), and I would like to block them all. Is it enough to block just /pop.php, or do I have to be more specific, e.g. "Disallow: /pop.php?m=xx"?
- If I want to target Googlebot specifically, which name is correct: Googlebot or GoogleBot?

Thank you in advance.
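For context, here is the fuller layout I'm considering. It's only a sketch: I'm assuming Googlebot is the right user-agent token (that's part of my question), and I'm assuming a plain prefix Disallow also covers the query-string variants of pop.php.

# Rules for all crawlers
User-agent: *
Disallow: /page1.html
Disallow: /page2.html
# Prefix match - does this also cover /pop.php?m=xx ?
Disallow: /pop.php

# Separate group in case I later need Google-specific rules
# (assuming "Googlebot" is the correct token)
User-agent: Googlebot
Disallow: /pop.php

Does having the Googlebot group mean Googlebot ignores the "*" group entirely, so I'd have to repeat all the Disallow lines there as well?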