Using site: command:
2 days ago: about 11,000 pages
Yesterday: about 15,000 pages
Today: about 4,500 pages
WTF???
He's not talking about visitors to his site. He's talking about the number of pages from his site that Google claims are in the index. site:URL -> returns the number of pages in the index. Apparently, it's jumping up and down from day to day.
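If anyone wants to track the swings instead of eyeballing them, here's a rough Python sketch that grabs the site: count and prints it with today's date so you can append it to a log. The "About N results" regex, the User-Agent, and example.com are all assumptions about what Google's results page happens to look like (and Google can block or reword this at any time), so treat it as illustrative only.

    # sketch: log the daily site: count for a domain
    # assumption: the count appears as text like "About 14,900 results" in the HTML,
    # which changes often and may be withheld from plain scripts
    import re, datetime, requests

    def site_count(domain):
        resp = requests.get(
            "https://www.google.com/search",
            params={"q": f"site:{domain}"},
            headers={"User-Agent": "Mozilla/5.0"},  # bare requests tend to get turned away
            timeout=10,
        )
        resp.raise_for_status()
        m = re.search(r"About ([\d,]+) results", resp.text)
        return int(m.group(1).replace(",", "")) if m else None

    if __name__ == "__main__":
        # example.com is a placeholder; append the output to a text file each day
        print(datetime.date.today(), site_count("example.com"))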
Yes, that's what I meant... I can only wish I had 15,000 visitors. How the heck can one domain show 15,000 indexed pages one day and 4,500 the next day? Stupid Google...
I guess it depends on what is being "surfed." I've always seen the most traffic and largest revenue on the weekends... that and Mondays. Seems many tend to surf more than work on Mondays.
Seeing the same thing, but I think mine is due to a penalty thanks to a stupid 'google is evil' buddy icon lol. :x Going from 11,500 to 4,000 and back daily...
Mine is going up already... it dropped to ~30% (!!!) last week and recovered to 90% of my old cached pages yesterday. I really hate it, but that's what you get from Google for starting a site in 2004 instead of 1996.
I'm getting crawled (over-crawled?), but no pages are being indexed or showing up in the SERPs like in the good days last year.
Looks like Google is up to something; my sites are seeing the same thing. Maybe a PR update? If I had to guess, it's probably results from different data centers.
I have been watching this for a few weeks now. Seems like they are almost reindexing every site in their index and then passing them through some new algo feature (which appears to remove a lot of pages based on not having enough unique content).

A couple of sites were bouncing around like this for a week or two, then G started to crawl them again, and about a week or two later they came back and are now stable. Now I am seeing a couple of other sites show the same behavior. Some sites with pages that have a fair amount of similarity have had a bunch of pages dropped from the index.

IMHO it looks like G is getting a little more strict about the amount/quality of the content on pages in their index (probably in an effort to reduce SE spam, etc).

NOTE: This is my opinion and does not reflect the opinions of this site or anyone else in the world.
Fryman, it looks like you're getting data from different datacenters. You can use this tool to see how the data differs for your site across the different datacenters: goya-rank.com/blingo.php
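The same idea can be done by hand if you'd rather not use the tool: run the identical site: query against a few different Google hosts and compare the counts. The host list below is a placeholder (fill in whatever datacenter hostnames you want to test), and the "About N results" regex is the same assumption as in the earlier sketch.

    # sketch: compare the site: count across several Google hosts
    # DATACENTERS is a placeholder list -- substitute real datacenter hostnames
    import re, requests

    DATACENTERS = ["www.google.com"]  # add other hosts here

    def count_from(host, domain):
        resp = requests.get(
            f"https://{host}/search",
            params={"q": f"site:{domain}"},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=10,
        )
        m = re.search(r"About ([\d,]+) results", resp.text)
        return int(m.group(1).replace(",", "")) if m else None

    for host in DATACENTERS:
        # example.com is a placeholder domain
        print(host, count_from(host, "example.com"))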
Whatever Google is doing, I don't think it has anything to do with insufficient unique content. Whatever is happening is happening to forum pages too, or maybe especially.
When I started this thread there were 4,500 pages. Right now there are 14,900... and tomorrow, who knows?...
I've noticed the same thing. I wish I had 15,000 pages, but that will happen in due time. I've noticed that when gbot is crawling, the numbers will go up and down by as much as 10-15% or more, sometimes from hour to hour. If I wait 24 hours after the crawl is finished, I find I almost always have more pages than the last time. Lately though it's been a bit strange; for the past couple of weeks I haven't had 24 hours between crawls, so it's hard to get a good idea of what's really going on there.