Hi, I recently switched my blog to a new domain and added it in Google Webmaster Tools. I found that more than 150 pages were not indexed or failed to be indexed, so I checked my robots.txt at http://www.blackjackbox.info/robots.txt:
--------------------------------------------------------------------------
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search

Sitemap: http://www.blackjackbox.info/feeds/posts/default?orderby=updated
--------------------------------------------------------------------------
That is what I found. Can I edit this file to allow the restricted pages to be indexed? Please help.
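For reference, the rules above can be checked locally with Python's standard urllib.robotparser; a minimal sketch, where the sample post URLs are made up purely for illustration:
--------------------------------------------------------------------------
from urllib.robotparser import RobotFileParser

# The robots.txt content exactly as posted above.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot falls under "User-agent: *", so /search pages are blocked ...
print(rp.can_fetch("Googlebot", "http://www.blackjackbox.info/search/label/tips"))        # False
# ... while ordinary post URLs remain crawlable.
print(rp.can_fetch("Googlebot", "http://www.blackjackbox.info/2010/01/some-post.html"))   # True

# Mediapartners-Google has an empty Disallow, which means "allow everything".
print(rp.can_fetch("Mediapartners-Google", "http://www.blackjackbox.info/search/label/tips"))  # True
--------------------------------------------------------------------------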
1. First, validate your robots.txt using your Google Webmaster account; see http://www.google.com/support/webmasters/bin/answer.py?answer=35237

2. About your block

User-agent: Mediapartners-Google
Disallow:

an empty Disallow: line is valid; it means "nothing is disallowed" for that crawler. You would only need a / (Disallow: /) if you wanted to block Mediapartners-Google completely. So apart from that, you exclude nothing except the folder /search, which appears to be identical/duplicate to your domain root content.

Actually, I see only ONE page; all your other internal links appear to go into /search and seem to have identical content. Allowing Google to index all those identical pages may result in a penalty for duplicate content ... may!
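If you do decide to let those pages be indexed despite the duplicate-content risk, the minimal edit would be to drop the Disallow: /search line, roughly like this (a sketch only; keep that line if you want /search to stay blocked):
--------------------------------------------------------------------------
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow:

Sitemap: http://www.blackjackbox.info/feeds/posts/default?orderby=updated
--------------------------------------------------------------------------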