The first thing on the list is to try to figure out why. Granted, Google is de-indexing a lot of fine pages, but that is not the case for every site across the board. A good place to start is with Google Sitemaps. If Google has a "problem" with your site, you can find that out there; however, they won't tell you what the problem is specifically. Dave
Nope, I use Google Sitemaps and don't see anything wrong there. The only thing different is that the indexing is reported as successful, whereas usually it is only partial. How would you find out what the de-indexing problem is using Google Sitemaps?
You cannot find out the specific problem, just whether there is one. Next, use the site: operator to find out if you have a canonical or duplicate problem, with any or all of the following being indexed:
http://mysite.com
http://mysite.com/index
http://www.mysite.com
http://www.mysite.com/index
Then try using something like Copyscape and the link: operator to look for duplicate content. Dave
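If the /index variant is indexed alongside the bare URL, a 301 in .htaccess can collapse the two. A minimal sketch only, assuming Apache with mod_rewrite and that your index file is index.html or index.php (adjust to your setup):

    RewriteEngine On
    # Act only on direct requests for /index.html or /index.php,
    # not on the internal rewrite that serves the directory index
    RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.(html|php)[?\ ]
    RewriteRule ^index\.(html|php)$ / [R=301,L]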
Heck, that just reminded me that I redirected traffic from the old blog to the new blog with an .htaccess file. The problem is I couldn't map all of the old URLs, so some of them redirect to the same page, which might count as duplicate content. ;( I have removed those redirects now and will accept having 404s. ;(
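For what it's worth, one way to handle that in .htaccess is to keep a 301 for each old URL that has a genuine new home, and let everything else 404 (or return 410 Gone) instead of funnelling it all into one page. A hedged sketch; the paths below are made-up examples, not your actual URLs:

    RewriteEngine On
    # Old posts with a real equivalent each get their own 301
    RewriteRule ^blog/some-old-post/?$ http://www.mysite.com/some-new-post/ [R=301,L]
    # Everything else under the old blog path returns 410 Gone rather
    # than piling onto one page (which can look like duplicate content)
    RewriteRule ^blog/ - [G]

Dropping the last rule entirely would give you plain 404s instead, which is what you've opted for.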
I think probably nothing helps except waiting, but two suggestions I have implemented that may have improved my position are: use .htaccess to redirect http://site.com to http://www.site.com (as above), and put a sitemap on the site itself, one click away from the home page, so that no page on the site is more than two clicks deep. Google Sitemaps doesn't appear to help, but a 'site' sitemap just might.
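For the first suggestion, the usual mod_rewrite recipe looks like this. A sketch only; site.com is a placeholder for your own domain, and it assumes Apache with mod_rewrite available:

    RewriteEngine On
    # Send any request for the bare domain to the www hostname,
    # preserving the requested path
    RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
    RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]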
Well, I don't use the above because I have never linked to myself without the www, and it would affect the other parts of my .htaccess code. Also, my backlinks seem to link only to the www version.
That is fine, but it does not prevent anyone else from linking to you without the www. Google appears to have a handle on this, but when it comes to them, it's a good idea to hedge your bets. Dave
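On the worry about it affecting the rest of the .htaccess file: if you put the host canonicalization first, every later rule only ever sees www requests, so existing rules shouldn't need to change. A sketch under the same assumptions as above:

    RewriteEngine On
    # 1) Canonicalize the host first: bare domain -> www via 301.
    #    The R=301,L ends processing here for any non-www request.
    RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
    RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]
    # 2) Your existing redirect rules follow, and now only ever
    #    match requests already on the www hostname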