I had 3k+ pages and 1500 were indexed by Google, but today I found in Google Webmaster Tools that only 4 pages are indexed. Four days ago I activated the Robots Meta plugin and added a robots.txt file, but forgot to allow Googlebot. Did I do something wrong, or is Google going crazy on my blogs?
If you are using a WordPress blog, there is already a default robots.txt file; you can view it at http://www.example.com/robots.txt. You can check the indexed pages via a site:www.example.com search, which will show the exact number of indexed pages in Google.
Google is often wrong about the stats they give out, and that may or may not affect SERPs. I have seen a few sites that Google gives really weird stats for in Webmaster Tools, and often that information conflicts with other information Google gives out elsewhere. I have one site with PR on hundreds of pages throughout the site and thousands of incoming links according to Yahoo; Webmaster Tools says the site has a couple hundred links, but if you do a "link:" search in Google there is one link listed, and it's coming from a page within the site itself. Kinda silly. My point is, as long as your position in the SERPs has not gone down too, don't worry too much about seeing strange things in your stats.
You said it yourself, and that's why. Fix your robots.txt to allow Googlebot and Google will slowly re-index your pages.
Just give it some time and the pages will come back. Tune up the robots.txt and keep pinging Google with your sitemap at a sensible interval.
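Pinging can be scripted rather than done by hand. Here is a minimal Python sketch, assuming Google's classic sitemap ping endpoint (http://www.google.com/ping?sitemap=...) and a placeholder sitemap URL; swap in your own sitemap location:

```python
# Sketch of a sitemap "ping" to Google, assuming the classic ping
# endpoint and a hypothetical sitemap URL (example.com is a placeholder).
from urllib.parse import quote
from urllib.request import urlopen

def build_ping_url(sitemap_url):
    # The sitemap URL goes into a query parameter, so percent-encode it fully.
    return "http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

def ping_google(sitemap_url):
    # Returns the HTTP status code; 200 means the ping was received.
    with urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status

# Example (uncomment to actually ping):
# ping_google("http://www.example.com/sitemap.xml")
```

Run it whenever the sitemap changes, not on every page load; a daily cron job is plenty.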
Those stats have to be wrong! I would file a ticket with Google and find out. Make sure your sitemap is clean.
Well, something similar to the OP happened to my site. Within one week the number of indexed pages went down from 1890 to 890. I did not make any changes to my robots.txt and was actually very pleased with the way Google was indexing more and more of my pages. Also, today I noticed in Webmaster Tools that Google no longer lists any of my keywords, after it had just updated the significance of my main keywords a couple of weeks ago. Not one keyword is listed. But this could be a temporary glitch in their system, so I will just wait a few days and see what happens.

I don't know what's going on with Google, but I surely don't like all those changes. The traffic I was getting from Google has gone steadily down since January, since the last PR update, to about 30 visits a day now. And don't even get me started on the SERPs. The number of indexed backlinks decreased by about 200 but has gone up about 100 again since last week. Google's speed at finding and indexing backlinks is a subject of its own anyway; even Bing finds my backlinks faster, and more of them, than Google does. There is a difference of about 700 backlinks between what Yahoo shows and what Google shows. 700 backlinks for a site only about 6 months old can make a big difference in the SERPs, I would think.

But I don't wanna complain too much about it, because when we started building backlinks we didn't pay much attention to the nofollow tag. At that time I didn't have a clue about certain tools and practices. But we are learning from our mistakes, and picking up some tricks. I'm just wondering if all this up and down is caused by Google rolling out Caffeine and the changes they made to their search algorithm.

Cheers, verdecove
And how can you tune up a robots.txt or ping Google? Yahoo Slurp and Googlebot are crawling my pages every single day, plus all the other small bots and spiders that do nothing but eat bandwidth.
One of my sites dropped from 2170 indexed pages to 92 overnight this week after I set a preferred domain and added a 301 redirect, because of a canonicalization problem with both the www and non-www pages being indexed in Google. I read that both the www and non-www pages would remain in the index until all the re-indexing was done, but this does not appear to be true. I don't know if you did the same, but I feel gutted about my loss and have my fingers crossed that the pages will reappear soon and that I don't lose any PR when we all dance at the end of the month.
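For anyone else doing the same canonicalization, the www vs. non-www redirect is usually one permanent (301) rule. A minimal Apache .htaccess sketch, assuming mod_rewrite is enabled and with example.com standing in for your domain (flip the two hostnames if you prefer the non-www version):

```apache
# Permanently redirect non-www requests to the www version of the site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status tells Google the move is permanent, which is what lets it consolidate the two versions instead of treating them as duplicates.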
I've experienced the same thing in the past. When it happened to me it all came back, although it was very slow. Best of luck to you.
I undid what I did, but still only 4 pages are indexed. Here is my old robots.txt code; I do not know how to allow Google, MSN, Yahoo, AltaVista and the other big names in one line. According to Google Webmaster Tools, 100 posts are blocked by the robots file. I deleted it yesterday and cannot see it via FTP, but it looks like the file is hidden. Please help me generate a robots.txt file that allows all bots to crawl my website.
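You don't need one line per bot: a wildcard user-agent with an empty Disallow allows every crawler (Googlebot, msnbot, Slurp, and the rest) in a single rule. A minimal robots.txt sketch, with example.com as a placeholder for your own site:

```
# Allow all crawlers full access to the whole site
User-agent: *
Disallow:

# Optional: point crawlers at your sitemap (adjust the URL to your site)
Sitemap: http://www.example.com/sitemap.xml
```

Note that `Disallow:` with nothing after it means "block nothing", while `Disallow: /` would block the entire site, so watch that one character. Upload it to your site root so it is reachable at /robots.txt.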
If you share your blog URL I will have a look at it; then I can tell you what is wrong with it and whether it is blocking Google from crawling.
Google deindexes the pages of a site for a reason. Try contacting Google directly and ask what the problem with your site is. Make sure you are not using any black-hat SEO.
It's very rare; I've never come across this problem in any of my blogs or sites. I'll keep an eye on this thread for a solution.