Hello, I am trying to set up my robots.txt to block certain pages. I want to block URLs like the following:

http://www.lawltech.com/cooking/infrared-grills-grill-your-meat-with-laser-guided-precision.php?Itemid=1
http://www.lawltech.com/random-fun-stuff/have-you-ever-tried-putting-a-tiki-torch-in-a-blender.php?Itemid=1
http://www.lawltech.com/computer-toys/from-85-degrees-to-47-degrees-in-5-minutes-the-usb-fridge.php?Itemid=1

But I would like to allow URLs like the following:

http://www.lawltech.com/cooking/infrared-grills-grill-your-meat-with-laser-guided-precision.php?Itemid=74
http://www.lawltech.com/random-fun-stuff/have-you-ever-tried-putting-a-tiki-torch-in-a-blender.php?Itemid=73
http://www.lawltech.com/computer-toys/from-85-degrees-to-47-degrees-in-5-minutes-the-usb-fridge.php?Itemid=69

As you can see, these URLs are nearly identical. The only difference is the Itemid at the end. I've added the following to my robots.txt file:

User-agent: *
Disallow: *?Itemid=1

Will this do the trick? Will it block all pages with Itemid=1 from being indexed? Thank you.
I would say it's:

User-agent: *
Disallow: /*?Itemid=1$

The $ marks the end of the URL; otherwise you would block ?Itemid=10, ?Itemid=11, and so on as well.
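Note that * and $ wildcards are not part of the original robots.txt standard; they are extensions honored by major crawlers such as Googlebot (and now described in RFC 9309), so behavior can vary by bot. To see why the trailing $ matters, here is a rough sketch of the wildcard matching logic; the robots_pattern_matches helper is my own illustration, not part of any library:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Test a URL path against a robots.txt Disallow pattern,
    treating * as "any sequence of characters" and a trailing $
    as "end of URL", the way wildcard-aware crawlers do."""
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    # Escape every literal character; expand * into .*
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, path) is not None

# Without $, Itemid=1 also matches Itemid=10, Itemid=11, ...
print(robots_pattern_matches("/*?Itemid=1", "/cooking/grills.php?Itemid=10"))   # True  (blocked!)
# With $, only Itemid=1 exactly is blocked:
print(robots_pattern_matches("/*?Itemid=1$", "/cooking/grills.php?Itemid=10"))  # False (allowed)
print(robots_pattern_matches("/*?Itemid=1$", "/cooking/grills.php?Itemid=1"))   # True  (blocked)
print(robots_pattern_matches("/*?Itemid=1$", "/cooking/grills.php?Itemid=74"))  # False (allowed)
```

Also keep in mind that Disallow only stops compliant crawlers from fetching the pages; to keep already-discovered URLs out of the index entirely, a noindex robots meta tag on the pages themselves is more reliable.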