Well my fellow spirit walker, I did read your post. It is more of an observation without a conclusion. Really though, I don't see how there could be a tool like that! We have the Google patent now that, I am totally convinced, sandboxes IBLs for an undetermined period of time. We have other SEs to consider that are currently murdering (deindexing) websites. Apparently all you have to do is COMPLAIN about a website to Yahoo and they will pretty well ban the site with their eyes closed. My point is, with so many other factors and algo updates, how can one predict the value in $ of a link?
Well, the conclusion was pretty obvious to me, but I was trying to spawn discussion more than pontificate about the "truth". I wondered if others had done this test and agreed, or if other tests had led to other theories about the items the tool uses in its evaluation. It would be an interesting exercise to reverse engineer the tool. That's what I was hoping would happen with this thread.
Validate it against what standard? That's exactly the problem: if we don't know how to calculate it, we certainly don't know how to validate it. Do you have any suggestions?
The only reason I didn't want to pursue this is that I see very little (if any) truth or value in this tool. As described in my post above, there are way too many other variables that would influence this on a daily basis. Just when you think you've got everything right, Google/Yahoo change the rules, making this tool next to impossible to maintain. I always enjoy a good puzzle and would love to break it down with you, Compar, but it's a waste of time, IMO. The truth is there is no truth here.