RAMBLING INTRO: (you can probably skip to THE PROBLEM)

Hopefully someone here can help me. I have a new domain that is registered for three years. The site was at GoDaddy and was just moved to a dedicated server in hopes of solving this issue. For the first weeks of its life, the domain had no robots.txt file. No big deal; robots should spider away when there's no file. However, Google has not spidered the site and it's not included in the Google index. The site is a WordPress blog with fresh, updated content, no spam, just a real, honest blog. I'm not even running any ads. This is my first (or second) post on this site and I'm not up to date on the rules, so I won't post the URL unless someone needs it, as I'm not sure if that is allowed.

THE PROBLEM:

Under Google Webmaster Tools, Dashboard > Tools > Analyze robots.txt reports this robots.txt file:

User-agent: *
Disallow: /

Until today, I had no robots.txt file at all, and I certainly would NEVER have one excluding all robots. Since Google insists the above is my robots.txt file, I added a real one that allows everything:

User-agent: *
Disallow:

THE QUESTION:

WTH (what the heck) is going on, and why is Google telling me I have a robots.txt file excluding all spiders? I want to be friends with Google and have them add my site. Any help would be sincerely appreciated.
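One way to see what your server is actually serving right now (rather than whatever Webmaster Tools has cached) is to fetch the file yourself and run it through a parser. Here's a minimal Python sketch using only the standard library; example.com is a placeholder for your real domain:

from urllib.request import urlopen
from urllib.error import HTTPError
from urllib.robotparser import RobotFileParser

# Placeholder domain; substitute your own.
site = "http://example.com"

# Show the raw robots.txt exactly as a crawler would receive it.
try:
    raw = urlopen(site + "/robots.txt").read().decode("utf-8", errors="replace")
    print(raw)
except HTTPError as e:
    # A 404 here means no robots.txt exists, which crawlers treat as allow-all.
    print("robots.txt fetch failed with HTTP", e.code)

# Ask the stdlib parser whether Googlebot may fetch the front page.
rp = RobotFileParser()
rp.set_url(site + "/robots.txt")
rp.read()
print("Googlebot allowed:", rp.can_fetch("Googlebot", site + "/"))

If the raw output here differs from what Analyze robots.txt shows, the problem is likely a stale cached copy on Google's side (possibly from the old GoDaddy host), not your current server.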
My robots.txt for my WordPress blogs goes like this:

User-agent: *
Disallow: /wp-includes/
Disallow: /wp-admin/
Disallow: /cgi-bin/
Disallow: /wp-content/cache/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /wp-content/upgrade/

I have had no problem with Google indexing my pages. One thing you can do is add a Google sitemap to your site; there are some good plugins available. Also, maybe you have a meta tag preventing Google from indexing your site.
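To quickly rule out a stray noindex meta tag (some WordPress themes or privacy settings emit one), you can scan the rendered front page for it. A rough Python sketch, again with example.com standing in for the real domain:

import urllib.request

# Placeholder URL; point this at your blog's front page.
url = "http://example.com/"
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace").lower()

# Crude string check: a proper fix would parse the HTML, but this is
# usually enough to spot <meta name="robots" content="noindex"> output
# by a theme or by WordPress's "block search engines" privacy setting.
if "noindex" in html and ("name='robots'" in html or 'name="robots"' in html):
    print("Possible noindex robots meta tag found; check your theme's header.php")
    print("and Settings > Privacy in the WordPress admin.")
else:
    print("No obvious noindex robots meta tag found.")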