The site in my signature has maintained its pages, and new pages are indexed quite quickly, at least until now. I have one other site that has held on to all 60 of its pages. My other 3 sites haven't been so lucky, and as mentioned above, any pages still listed tend to be the least useful - copyright pages, out-of-date supplementals, etc. I wish I could identify the important factor.
Nintendo (who seems to have more sites than most of us have toothpicks) has indicated in a couple of other Coop-related threads that he has several sites that increased, although the majority of his sites decreased.
I hope you do too, as long as you come back here and tell the rest of us! One of two things is certain: either the changes at Google are what they intended them to be (which I doubt), or they are working to fix a problem with "BigDaddy" that they didn't discover until it rolled out all the way (the most likely scenario). Either way, I don't see much information coming out of Google about this issue. They have historically been very tight-lipped about anything going on inside their business, and this situation will most likely be the same. In a few months these problems will probably clear up and Google will never say anything about what happened, or even admit anything happened at all. There is an old saying that goes, "the only bad publicity is no publicity at all," and while scores of webmasters are angry with the Google results right now, the truth is that folks are talking and writing about Google daily, which in a roundabout way is publicity. Just something to think about.
Thanks for the good news, lol. allinurl: only shows half of the pages that the site: command does... and neither one of 'em is anywhere near correct.
Well, I'm starting to see a gradual increase in the number of pages indexed, across all the datacentres, for all our sites (I'm using this tool: http://www.yourcache.com). Also, has anyone noticed a sudden surge in rankings? I would normally put it down to coming out of the sandbox, but it's hitting all of our sites, including those already out... I'm seeing jumps of 200-300 positions for some keywords, sometimes even more (e.g., not ranking in the top 1000, then suddenly surging to 31). I just hope they all stay stable.
G is walking all over me. I lost half the pages I had listed today, and what was listed was already only half of what it should have been before today's loss.
I recovered and had my site completely indexed again. And a few days later they took away my most important content pages again. That caused me to drop 30 spots in the SERPs too. I wish they would make up their minds lol
I see another increase in pages indexed for the domain overall (not for the forum), but again these are OLD pages which no longer exist. Here's an example: that page hasn't existed in months - look at the cache date! So again, what I see is: new pages - not in the index; old pages that don't exist and haven't existed for some time - lots of them in the index, even if they are mostly supplemental. Conclusion: Google is totally screwed up.
I am seeing the same, on and off; it depends on which DC is being used for google.com. Lots of July-August 2005 crap.
You might take a look at this: http://72.14.203.104/search?q=cache...s/workshops.htm+site:www.psychlinks.ca/pages/ brings up your error page, error.php. But http://72.14.203.104/search?q=cache:www.psychlinks.ca/pages/workshops.htm brings up nothing. Running http://www.psychlinks.ca/pages/workshops.htm through a good header checker yields two results: a 302, then a 404. We know Google has problems with 302s, so you might try to configure things so that the server sends back a 404 immediately if a page is no longer on your site. That will eliminate any confusion about whether or not the page exists, or whether it is simply temporarily redirected and therefore the URL should be reused in the future.
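If you want to check the chain yourself, here's a minimal header-checker sketch in Python (standard library only; the URL is the one discussed above, and the hop limit is an arbitrary assumption). It deliberately does not auto-follow redirects, so the 302 and the 404 show up as separate responses:

```python
# Minimal header checker: prints each hop's status code instead of
# silently following redirects, so a 302 -> 404 chain is visible.
import http.client
from urllib.parse import urlparse, urljoin

def check_headers(url, max_hops=5):  # max_hops is an arbitrary safety limit
    for _ in range(max_hops):
        parts = urlparse(url)
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc)
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        print(resp.status, resp.reason, "-", url)
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 303, 307, 308) and location:
            url = urljoin(url, location)  # follow manually to the next hop
        else:
            break

check_headers("http://www.psychlinks.ca/pages/workshops.htm")
```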
There is no 302 redirect set up as such. The reason you get those headers returned is that the page doesn't exist: the server then redirects (302) to the error page, which returns the 404. That's another one of the pages I'm talking about that was deleted several months ago. This is old index data we're seeing. I can see in my logs that Google is back to crawling sites from some old data - page after page that hasn't existed since a redesign several months ago. Google had no trouble with any of that site redesign before Big Daddy. Since Big Daddy, what I see for the Psychlinks site is new pages disappearing, old pages back in the index, and Googlebot using old data from somewhere, trying to find nonexistent URLs. As someone (maybe Old Welsh Guy) suggested a while back, I do think part of the mess is Google's new bot caching, or whatever Cutts called it.
If the page doesn't exist, it should return an HTTP 404 status code, not an HTTP 302. Crawl caching proxy servers - I think they may exaggerate what was a small problem: the 302 redirect status code and Google's indexing problems within a site.
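For the sake of illustration, here's a minimal sketch of the recommended behavior: answering missing paths with a 404 directly rather than 302-redirecting to an error page. This uses Python's standard http.server; the page inventory and content are hypothetical stand-ins:

```python
# Sketch: answer missing pages with a direct 404 (no 302 redirect to an
# error page), so a crawler gets an unambiguous signal. Paths are made up.
from http.server import BaseHTTPRequestHandler, HTTPServer

EXISTING_PAGES = {"/", "/pages/contact.html"}  # hypothetical inventory

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in EXISTING_PAGES:
            body = b"<html><body>page content</body></html>"
            self.send_response(200)
        else:
            # The error page is served WITH the 404 status, not behind a 302.
            body = b"<html><body>Page not found.</body></html>"
            self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8000), Handler).serve_forever()
```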
Damned if I know. But I understood that if two status codes are returned in a chain like that, the second (final) one should be obeyed. Also, until Big Daddy, Google wasn't having a problem with it - whatever they did in Big Daddy created the problem.
I think the "302" fix not only didn't work, but made the problem worse. More than likely "BigDaddy" was designed to address and fix these issues, but someone in the coding department decided to try to do even more, and that is why there are problems now. This does not explain the cache date of Minstrel's results, because his pages are PHP, but I have noticed on static pages that Google's cache date is the last time the page was modified, not the last time it was "polled" by Googlebot. If Googlebot receives a 304 "Not Modified" response, it does not update the cache at all unless the page actually did change. I think this is why some of these old cache dates are showing up, but like I said, it doesn't explain Minstrel's situation because his pages are PHP, not static.
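The 304 behavior described above is just HTTP conditional fetching. Here's a small sketch (Python standard library; the URL and cached date are placeholders) of how a crawler sends If-Modified-Since and keeps its old copy - and old cache date - when the server answers 304:

```python
# Sketch of a conditional GET: the crawler replays the Last-Modified
# value from its cached copy as If-Modified-Since. A 304 answer means
# "unchanged", so the old cached copy (and its old date) is kept.
import urllib.request
import urllib.error

URL = "http://example.com/some-static-page.html"  # placeholder
cached_last_modified = "Sat, 01 Apr 2006 00:00:00 GMT"  # from a prior crawl

req = urllib.request.Request(URL)
req.add_header("If-Modified-Since", cached_last_modified)
try:
    resp = urllib.request.urlopen(req)
    print(resp.getcode(), "- page changed, refresh the cached copy")
except urllib.error.HTTPError as err:
    if err.code == 304:
        print("304 - not modified, keep the old cache and its old date")
    else:
        raise
```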
I have a client that has 30K pages indexed in MSN and Yahoo, and until recently it was the same in Google. Pages are now down to 958 in Google, and the site is in no way black hat. There is an obvious indexing problem with Google that is separate from the site: problem. It would be nice if they would just be honest about stuff like this instead of wasting SO MUCH of everyone's time to protect their name. The truth of the matter is I would respect them more for admitting their mistakes rather than trying to follow in Microsoft's footsteps and telling you after they fix it. What happened to "Don't Be Evil"? Not telling the truth doesn't seem to fit into that phrase... lol.