Hello, my indexed pages are gone. They were about 450,000 and now they are 65,000. Is anybody having the same issue?
It happens with big sites sometimes, although you may also not be looking at accurate data. Just give it time, be patient, and it will come back...
This does happen often with Google. I have seen it many times over the years. The only thing that seems to help is to build quality links, and if you have a large website, make sure some of the link building points to the inner pages as well.
That happened with many sites this spring. I would recommend the following steps:

1. Check your robots.txt to see whether you accidentally disallow some directories on your website.
2. Check your pages for a robots "noindex" meta tag. Some SEO packs for CMS engines have this option enabled by default. Use it wisely.
3. Check your internal linking structure. It might be that some of your internal links to the problematic/disappeared pages are broken (use Google Webmaster Tools for that).
4. Check the "description" meta tag on the problematic pages. The text in this tag should vary from page to page. Google recently started throwing away pages with similar description fields (use Google Webmaster Tools to see pages with the same description).
5. Check your pages for duplicate content. If you have a lot of duplicate or non-unique content, Google may apply a site-wide penalty.
5.1. Some CMS engines ship default pages (forum, blogs, news, etc.) that are not unique across the web. If they accidentally get into the index, Google might penalize the whole website to a certain extent.
6. Improve your internal linking structure. The best approach currently is not sitemaps or extensive menus, but interlinking in the style of Wikipedia, where the internal links sit inside the content. Google loves it!
7. Write and add some new, unique content for the problematic directories.
8. Get some backlinks to these problematic directories from trustworthy websites.
9. Finally, validate your HTML code for serious bugs/mismatched tags and for compliance with the standards.
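Checks 2 and 4 above can be automated. Here is a rough, illustrative sketch (my own, not an official tool) that scans a batch of pages for a robots "noindex" meta tag and for description texts shared by more than one page. The URLs and HTML below are hypothetical stand-ins; in practice you would fetch the real pages first.

```python
# Illustrative sketch: find pages blocked by a "noindex" robots meta tag
# and pages that share the same meta description. Page data is hypothetical.
from html.parser import HTMLParser
from collections import defaultdict

class MetaScanner(HTMLParser):
    """Collects the robots and description meta tags from one HTML page."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = dict(attrs)
        name = (attr.get("name") or "").lower()
        if name == "robots":
            self.robots = (attr.get("content") or "").lower()
        elif name == "description":
            self.description = attr.get("content") or ""

def audit(pages):
    """pages: dict of url -> html source.
    Returns (urls with noindex, {description: [urls sharing it]})."""
    noindexed = []
    by_desc = defaultdict(list)
    for url, html in pages.items():
        scanner = MetaScanner()
        scanner.feed(html)
        if scanner.robots and "noindex" in scanner.robots:
            noindexed.append(url)
        if scanner.description:
            by_desc[scanner.description].append(url)
    dupes = {d: urls for d, urls in by_desc.items() if len(urls) > 1}
    return noindexed, dupes

# Hypothetical pages for illustration:
pages = {
    "/a": '<meta name="robots" content="noindex,follow">'
          '<meta name="description" content="Widgets">',
    "/b": '<meta name="description" content="Widgets">',
    "/c": '<meta name="description" content="Unique text">',
}
noindexed, dupes = audit(pages)
print(noindexed)  # pages blocked by a noindex meta tag
print(dupes)      # description texts shared by more than one page
```

On this toy input, "/a" is flagged as noindexed and "/a"/"/b" are flagged for sharing the same description, which is exactly the pattern the checklist says to hunt down.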
Google has done this many times in the last few months, and then the pages get indexed again. I can't understand what is going on.
Try submitting an XML sitemap in Google Webmaster Tools, or submit your site to StumbleUpon to get it reindexed.
I think this is a normal issue with Google. I have also submitted my XML sitemap via Google Webmaster Tools, but I still often notice indexed-page fluctuations in my SeoQuake toolbar, especially when I don't update my sites or add new pages for a period of time. Thanks.
I think this is temporary, as results are being served from some other data center. Wait for some time and it should be back.
You will indeed see unstable indexed-page counts. You'll also see unstable traffic. This is because of the latest Google update (rolling out since day 1 of this month), called "Google Mayday". I am surprised no one has heard about it on DP. The update is still in progress. Some people are saying Google is broken and showing weird results because of it (which sends less traffic to quality sites), but we'll just have to sit and wait.
The exact same thing happened to me about 2 weeks back. Here's what people suggested I should do: http://forums.digitalpoint.com/showthread.php?t=1804014
Thanks for the suggestion ;-) I will probably write a memoir when I'm very old. For now, I have much more interesting stuff to offer to SEO people.