I don't know what the problem is. I have two sites where Google only indexes the homepage, even though each site has more than one page. When I search for the domain in Google, the only result is the domain name itself, without any description or any other page from the site. In Google Webmaster Tools, when I try to analyze robots.txt, I see this: Home page access: Googlebot is blocked from http://....
If your robots.txt is blocking Googlebot from your pages, just edit it to block or unblock whatever you want. Look in the public folder (document root) of your website for robots.txt, then download and edit it. Or, if you want to allow access to all pages, just create a new robots.txt that allows everything. It would look like this:
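User-agent: *
Disallow:

An empty Disallow line means nothing is disallowed, so every crawler is free to fetch every page on the site.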
I tried that, but it isn't the solution... this Google is driving me insane. When I check robots.txt in Webmaster Tools under Analyze robots.txt, it says this:

URL: cheaprctoys.info/
Googlebot: Blocked by line 2: Disallow: /
Detected as a directory; specific files may have different restrictions
In your www.NinosWebJourney.com/robots.txt file, you have placed the following:

User-agent: *
Disallow: /

This will block everything! The "/" after Disallow tells every crawler to stay away from the entire site, which is exactly the "Blocked by line 2: Disallow: /" message Webmaster Tools is showing you. If you want the SEs to index your site, remove the "/" so the Disallow line is left empty:

User-agent: *
Disallow:

Also, go to www.robotstxt.org/ for the complete commands, examples, and command syntax.
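If you ever want to block just part of a site instead of all of it, the same syntax takes a path. As a made-up example (the /private/ directory name is hypothetical, not something from your site), this keeps crawlers out of one directory while leaving everything else open:

User-agent: *
Disallow: /private/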