Are you updating your sitemap with Google Sitemaps? Do these URLs come from old 302 redirects that now return 404s? Google will save and revisit a URL if it gets a 302 status code, because it treats the redirect as temporary. The way Google caches the content on the other side of a 302 redirect is where the hijack and duplicate-content penalties kick in. But if those URLs are now returning 404 status codes, they should not be followed in the future, and Sitemaps should help with that.

As for the double-slash URLs, they can come from bad rewrites, improper linking, etc. Some of them could come from the proxy cache servers, which get fed data from other Google services: someone types in the double slash, gets the 302 redirect and an error page through some Google service (site search, or perhaps just the Toolbar), and that gets fed into the cache. Googlebot then visits it, gets the 302, and picks up the error page's content. I wonder how many more sites are suffering from the same thing.
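Since the whole question is what status code Googlebot actually sees, it's worth checking the raw response rather than trusting a browser, which follows redirects silently. Here's a minimal Python sketch along those lines; the URLs in it are hypothetical placeholders, so swap in the ones from your own server logs:

    # Minimal sketch: print the raw status code (and Location header, if any)
    # for each URL, WITHOUT following redirects. The URLs listed below are
    # hypothetical placeholders -- substitute the ones from your logs.
    import http.client
    from urllib.parse import urlsplit

    def raw_status(url):
        """Return (status, location) for a URL; redirects are not followed."""
        parts = urlsplit(url)
        conn_cls = (http.client.HTTPSConnection
                    if parts.scheme == "https" else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc, timeout=10)
        try:
            # HEAD is enough to see the status line; passing the raw path
            # preserves any double slashes exactly as a crawler would send them
            conn.request("HEAD", parts.path or "/")
            resp = conn.getresponse()
            return resp.status, resp.getheader("Location", "")
        finally:
            conn.close()

    urls = [
        "http://www.example.com/old-redirected-page",   # hypothetical
        "http://www.example.com//double//slash//page",  # hypothetical
    ]

    for url in urls:
        status, location = raw_status(url)
        print(status, url, location)

A 302 that leads to an error page is the dangerous combination described above; a hard 404 (or 410) tells Google to drop the URL, and double-slash variants are usually best 301-redirected to the single-slash canonical path.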
A friend's website was indexed in Google, and very well too: some results put him on the first page, and some even at the top. But now his website can't be found in Google at all, except sometimes through "site:www.abc.com", and even that sometimes works and sometimes doesn't. What happened to the pages Google had indexed? What is going on?
The crazy results continue, but with some improvement. The site: command was only showing the root URL, and the rest were old pages listed as supplemental. Yesterday, some DCs started to show an increased number of pages, and today the site: command returns the root plus 3 new pages; the rest are still the old pages as supplemental results.
Finally, the truth is out there. It's true that Google has problems, and we can see it from many members here.
Yeah, true. It's a pity that my friend isn't getting any visitors from Google. I hope the DCs recover soon, and that my friend stays upbeat. Cheers. Tomorrow will be a better day.
Great thread. I've read through it, but I couldn't see whether anyone has commented that even though the site: operator isn't working, their traffic hasn't been affected. I'm seeing fluctuations in site: results from day to day, but traffic doesn't appear to have been affected. On my newer websites I'm seeing the correct number of pages; my older websites are the complete opposite.
Unfortunately, I think that many webmasters ARE seeing a change in traffic. Google has always been my #1 referrer. Since Big Daddy, they are slipping badly in that regard.
Hard to say. Is it the site: query that's broken, or the index itself? If the former, was it intentional, just as when they broke the link: query?
My traffic has been affected; now it's zero. My URL has even been removed from Google's index. I emailed Google, and they sent an automated response.
The site: command still returns 234,000, while different DCs return 27,600 to 41,500 as the number of indexed pages. Today it shows the root and 4 real pages; the rest are still supplemental results made up of old, non-existent pages. That is 1 real page more than yesterday. Is Google really planning to add real indexed pages at a rate of 1 a day?
My friend's website is still having problems with G. It used to have at least 100+ pages indexed, but now it's down to around 13. That's terrible news for my friend.
Hard to count; even estimating it would be a problem. I think most people are really counting on G. Imagine if every website lost its Google traffic now... the damage would be uncountable.
So, just an example site:

Before the latest deindexing issues: ~12k pages
During the deindexing: down to 80 pages
Last week: back to 12k pages (newly cached)
Today: 800 pages

Go figure.
My traffic tanked too. Now that site: is "working again" on my site, my traffic is starting to return. So no, it is NOT just a site: bug; traffic IS being affected.
Another good day in Google land. The number of new pages increased from 4 to 13. It still shows 234,000 results, but the rest are old pages listed as supplemental. The DCs show 41,000.