I want to block certain types of dynamically generated pages through robots.txt. The format of these pages is:

domain.com/page.php?variable1=value
domain.com/page.php?variable2=value
domain.com/page.php?variable3=value

If I add:

Disallow: /page.php?*

would it work? Please suggest.
Disallow: /page.php?* will block all URLs like:

domain.com/page.php?variable1=value
domain.com/page.php?other-words.php

Note that robots.txt matching is prefix-based, so Disallow: /page.php? is enough on its own; the trailing * wildcard is an extension honored by Google but not part of the original robots.txt standard, so not every crawler understands it. You can check and test your existing robots.txt via Google Webmaster Tools > Site Configuration > Crawler access. Also, make sure that your robots.txt file is valid: http://tool.motoricerca.info/robots-checker.phtml
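A quick way to sanity-check the rule is Python's standard-library robots.txt parser. A sketch, using the placeholder domain.com from the question; note that the stdlib parser only does prefix matching (no * wildcards), so the rule below omits the trailing *:

```python
from urllib.robotparser import RobotFileParser

# The rule from the answer, without the trailing * (prefix match is enough).
rules = """\
User-agent: *
Disallow: /page.php?
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Dynamic page.php URLs with a query string are blocked...
print(rp.can_fetch("*", "http://domain.com/page.php?variable1=value"))  # False
# ...while other pages remain crawlable.
print(rp.can_fetch("*", "http://domain.com/other.php"))  # True
```

This only tells you how a strict, standard-compliant parser reads the file; for Google's own interpretation (including wildcards), use the Webmaster Tools tester mentioned above.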
Is this to avoid duplicate content (for example, ?variable1=value changes the order of a product list, but the page still shows all the same products)? If so, you can use rel=canonical so that Google knows example.com/product-list.php and example.com/product-list.php?sort=asc are the same page.
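For reference, the canonical hint is a single tag in the page's head; a sketch using the example URLs above (product-list.php and the sort parameter are illustrative, not from your site):

```html
<!-- In the <head> of example.com/product-list.php?sort=asc -->
<link rel="canonical" href="http://example.com/product-list.php" />
```

The sorted variant stays crawlable, but Google consolidates it onto the canonical URL instead of treating it as duplicate content.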