Ever notice that what Matt Cutts calls Google's "everflux" seems to have a distinct pattern? If we expect to see small, continual changes in the SERPs every day, those changes should be somewhat random: some SERPs up, some down. But in the 61 keyword phrases I am tracking, I've noticed that on any given day most of them will be "in the red" on the tracker, and the next day most of them are "in the green". There seems to be a pattern of domain-wide SERPs going up (or down) in concert with one another. No big deal, but it makes me curious what there is about Google's algorithm that seems to downgrade you site-wide one day, then upgrade you site-wide the next, following that same pattern over and over. Is anyone else noticing the same pattern in your SERPs?
Been noticing that for at least a month or so now. I've also noticed that my supplemental pages come and go, and only seem to rank when they are gone. (Wish they would go away permanently!!) I have also noticed (and posted a question about this earlier) that the pages indexed when I query link:www.domain.com do not always match what I see when I go through my Sitemaps account and check the indexed pages that way. I am sooooo very tired (as I am sure everyone else is too) of G being so radically random on a daily, if not hourly, basis. I just don't get it.
SERPs sometimes even change when you refresh your browser, or go back and forward. I think it was about a week ago when I noticed this: I kept jumping from position 17 to 53, and every time I refreshed it changed. The pattern I see is that at night a lot of DCs change to another algo (?) and then they change it back during the day.
Here's something I have noticed, but can't prove yet. When someone visits my site from Google and I search the exact phrase they used to get there, I often notice that my result isn't on the same page it was on when the visitor made the search. Sometimes it's a page further from the top of the results and sometimes it's a page closer to the top. I have noticed recently that if a person visits my site and stays more than one minute, or visits other pages while there, my listing moves up dramatically in the results within minutes. However, if they only visit for a few seconds and return to Google, then that listing drops lower just as fast. Could it be that Google is determining how relevant a page is by how long a consumer stays on it after they click through from the SERP? Just something I've noticed, but as I said, I can't prove this yet with any hard evidence.
Can you run some tests on this? This is something I believed Google would naturally move towards and end up implementing, as it makes total sense from a SERP perspective. If someone enters a keyword and hangs around on the site, it is a clear indication that the result was what the user wanted (so it should go up in the SERPs); if they don't hang around, it is not what the user wanted and it should go down. It's a far more accurate judge of relevance than relying on backlinks etc. This is the first time I have heard anyone say they have seen this in action, so very interesting. It should finally help sites that provide good content over those that just look for blackhat methods to improve their SERPs.
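The dwell-time idea being discussed can be sketched as a toy scoring rule. To be clear, everything here (the function, the thresholds, the multipliers) is invented for illustration; nobody outside Google knows whether or how they would use such a signal.

```python
# Toy sketch of a dwell-time relevance signal. The thresholds and
# multipliers are invented; this is NOT Google's actual algorithm.

def adjust_score(base_score: float, dwell_seconds: float) -> float:
    """Boost a result whose visitors stick around, demote one they bounce from."""
    if dwell_seconds >= 60:        # visitor stayed at least a minute
        return base_score * 1.10   # modest boost
    if dwell_seconds <= 5:         # quick bounce back to the results page
        return base_score * 0.90   # modest demotion
    return base_score              # in between: leave the score alone
```

Under this toy rule a one-minute visit nudges the page up and a five-second bounce nudges it down, which matches the rise-and-fall described in the post above.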
Is there any chance that the SERPs are remaining pretty stable, but the tool that you check with is fluctuating between datacentres?
I have to disagree with you, because if something like this were determining the SERPs, all someone would have to do is create a script, put all their competitors' websites in it, and have a bot visit from multiple IPs for 10 to 15 secs and leave. I don't think Google is that dumb, but who knows, maybe they are. I do believe that time spent on a site is important and is a factor in the algo, but not to the extent you might be suggesting. Right now I think there is flux, with results going from one DC to the next. It has been suggested that Google is trying to implement a new update to BD, so this might be causing some unstable results. Just my 2 cents.
I didn't test it for just one site; I tested it for different websites, and I don't see any such pattern. Maybe it is only valid for one website. [offtopic: that's my 1000th post]
It's a little difficult to test with so many different things entered into a search. However, I do agree with you that this makes sense from a Google point of view, and it might explain why results are the way they are today. If results are based on user experience over a one-minute period for a certain keyword entry, it would make sense that results are always changing depending on what users search for on Google. With tracking cookies and "IP tracing" it would also be very difficult for someone to game the system without a huge amount of resources and IP addresses to draw upon. My opinion is that Google has moved way beyond PR and TR, and they are looking at the overall user experience through the amount of time people spend on the results they offer for any keyword phrase. If I'm right about this, which is not something I'm willing to say as of yet, Google has found a way to move search results on a minute-by-minute basis using a time-tested formula: when people like something they will stick around, but if they don't like it, they will go away. Like I said, I can't prove this, and if they are using some kind of new search formula like this, then they are way ahead of anyone else in the search business. It must be a huge drain on resources to make it work in real time with millions of search queries every second. As for gaming it, that would not take long for them to figure out, because people would overdo it eventually, say by sending a bot to hit a keyword phrase that only averages 10 or 20 searches per day 100 or 200 times. These kinds of things would be thrown out and the system would still work just fine. There are millions of keyword phrases searched for each day, and when something doesn't look right the whole thing gets thrown out as spam.
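The "thrown out as spam" step described above amounts to simple outlier filtering on query volume. A minimal sketch, with an invented threshold and window; the real criteria, if any, are unknown:

```python
# Toy outlier filter: discard a day's click signals for a phrase whose
# query volume spikes far above its historical average. The 5x factor
# is invented for illustration.

def is_anomalous(history, today, factor=5.0):
    """True if today's volume exceeds the historical daily mean by `factor` times."""
    if not history:
        return False               # no baseline, nothing to compare against
    avg = sum(history) / len(history)
    return today > factor * avg

# A phrase averaging 10-20 searches a day suddenly hit 150 times:
print(is_anomalous([12, 15, 10, 18, 14], 150))  # the bot run stands out
```

A crude rule like this would catch the clumsy case from the post (hitting a low-volume phrase 100 or 200 times) while leaving normal day-to-day variation alone.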
I'm not saying this is what is happening, but it does explain this hour-by-hour and minute-by-minute "everflux" better than anything else I've seen so far. I think search engines are all about results that the average consumer wants to see, not about SEO or anything else most of us webmasters might like to think. Just my opinion, and I'm very open to listening to other ideas which more experienced folks might have. I've seen Google change so much over the past couple of years that nothing they do really surprises me anymore.
This is just too nerve-racking, lol. I was just checking our rankings for our primary keywords, which we have never been ranked for, and found that we were top 10 for all those key phrases (3 and 4 word phrases). After getting all excited, because this is somewhat of a new site launched Dec 3rd, I went back and searched again from my DC on Google and was nowhere to be found. Google is most definitely in flux.
Mark, I agree - I have seen the same effect. But the logic of this is flawed if they are doing it. Say I write a page answering a common question (think: d'oh, I knew that, just forgot): on target, simple, understandable; maybe you don't even need to read the page, just look at the image. The visitor heads back to Google to enter another query (maybe not even related), and so Google downgrades the site because it only took the visitor 10 or 15 seconds to get the answer... Logic, scientific method, and philosophy seem to be lacking in the Google brain trust, judging from other flaws I have seen.
Good observations from several above. But I don't think any of the theories explain why a bunch of tracked KW phrases will consistently be down one day and up the next. This suggests to me that Google is doing something domain-wide in the SERPs.
It's usually just API issues or a matter of which server you're hitting. Often if you go to Google or McDar and do a search you'll find your keyword positions are still there, even though the tracker showed them dropped.
This is probably the most extreme example I have... though the most jumpy period was in February, it sometimes happens from morning to evening, and I don't consistently check at the same time of day. BTW, this is not a 'paying' keyword phrase (2 words), but two major words from phrases I am targeting, and there have been no clickthrus since early March. I use it to check the overall health of the more valuable KW phrases. I blame the drop at the end of the graph on Google deindexing many pages from a site that links to me in April.
That reminds me, I had an idea for a search engine - rank the sites, and then randomize the top 10 or 20. How accurately can you actually rank the top sites anyway? Might as well share the first page or two. Look at the v7ndotcom elursrebmem contest results...
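For what it's worth, the rank-then-randomize idea above is trivial to implement. A sketch, with placeholder site names and a made-up list of 30 results:

```python
# Sketch of "rank, then randomize the first page": order the results as
# usual, then shuffle only the top 10 so near-equal sites share page one.
import random

ranked = [f"site{i}" for i in range(1, 31)]  # hypothetical ranked results
top10, rest = ranked[:10], ranked[10:]
random.shuffle(top10)             # reorder page one only
serp = top10 + rest               # everything below page one keeps its rank
```

The same ten sites always make page one; only their order within it changes, which is roughly what the contest results looked like.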
Two theories: 1) I agree! These Google cheerleaders make me want to throw up. They haven't got their main business priority (search) right in the last 4-5 months... so who says they're so smart? Who makes this many changes without testing thoroughly? There have clearly been attempts at other algo changes that have been backed out (reverting to an older index). Oops, we lost your pages, let's start over. Believe me, people... I work with a bunch of PhD engineers, and most of these guys can't tie their shoes without falling over on their heads. 2) On the other hand, maybe it's BY DESIGN. Google isn't trying to be consistent at all... they are trying to mislead SEOs and get us all talking about weird theories. They could just insert a randomizer into the algo. They are probably laughing at us right now... I imagine a forum post or two has made it to the lunch room billboard. Matt C is not our friend... he is part of the game... to mislead. After all, before his blog popped up, things were relatively stable. What are your theories?
Ugh I just took an across the board hit on the 900 key phrases that I am tracking. Seriously is Google trying to give me a heart attack?
I did a test with refreshing again, for instance on http://64.233.161.99/. I did a site:www.url.com on just a random site, and it came up with 114 indexed pages. I refreshed four times and all of a sudden it was 119 pages; refreshed again, 114 pages, 119 pages, 114 pages, etc. The same thing happened with the search results. This test was with my own site, again on the same datacenter http://64.233.161.99/: when I refresh a few times, the search changes from position 16 to 53, nothing in between. Cookies were turned off, so no high-tech bio-engineered cookies or anything. Also, the number of results flipped between the same two values. I think they are "just" busy cleaning house.
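One mundane explanation for bouncing between exactly two counts like that: a load balancer behind that single IP alternating between two index copies that are slightly out of sync. A toy simulation of that behavior (the 114/119 counts are from the post above; the round-robin setup is an assumption, not anything Google has confirmed):

```python
# Simulate a load balancer round-robining between two index snapshots.
# Each "refresh" hits the other backend, so the reported count flips
# between exactly two values and never lands anywhere in between.
from itertools import cycle

backends = cycle([114, 119])                   # two slightly stale index copies
observed = [next(backends) for _ in range(8)]  # eight browser refreshes
print(observed)        # -> [114, 119, 114, 119, 114, 119, 114, 119]
print(set(observed))   # only ever the same two counts
```

This would also explain flipping between exactly position 16 and 53 with nothing in between: two backends, two answers.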
And it ain't good. I have a website about dogs that has been around forever, and I have always had a lot of visitors every day; I can see from my stats what pages they land on and so on. Today, most pages other than the index page have gone from the index, which means a drop of 50% in visitors. I am not sure what is happening, but nothing has been changed for a long time, so I don't really understand what is going on, and it seems that not even Google knows what the heck is going on! Like someone said, Matt Cutts is not our friend! I don't like it when people want other people to report spam sites. Don't get me wrong!! I hate spam sites and sites with scraped content, since they are just lousy thieves, but it reminds me of the Stasi (East German police) and the KGB, where everybody spies on everybody and you can get reported for something you didn't even do...