Bingo - you are lucky you were not deleted from the Google index, like phynder.com was - for buying links...
I am not seeing this effect in this update. PR for my site is going from 5 > 6. But I have yet to see new PR in the toolbar. During jagger, I saw my links go from 1700 > 511. This update the backlinks went to 509, despite having some new ones. (I expected a small increase, and yes, I know the backlink count is not accurate.) I think we are seeing some jagger type movements. My theory on this is here: http://forums.digitalpoint.com/showpost.php?p=662539&postcount=739
I believe Google has been cracking down on sites that are known to sell links and reducing or eliminating their ability to pass PR (an "educated" theory).
That is the only "frowned upon" activity that I was involved in - that I am aware of. Your point is well taken - there is no way to know why Google has dropped a site from the index. All I can do is look at the sequence of events and make a judgement. Jan, Feb and Mar I was buying links like a madman for Phynder.com, which was a PR5 site. At the next update, Phynder.com went to PR0 and nothing in the index. Any other ideas?
You have a valid point; however, would you not say your ban was manual and not algo related, for example if someone reported your site? If buying links can get you banned (and if I could afford to), I could buy a ton of links for my competitors and get them banned, am I wrong?
If Phynder.com had pagerank before, and links other than paid ones, that would not explain the PR0. I have bought links, but not for the PR or even the rankings. I bought them to ensure I got good indexing, that Google would not rank a 302 redirect pointed at my site ahead of me, and to ensure that sites such as belahost and a bevy of proxy cache sites that got crawled about the same time as Matt's post would be recognized as the duplicates, not mine.

The flaw in the logic here is that Matt and Google assume that everyone who buys links is out to get higher PR and rankings. There are plenty of other reasons to purchase links, especially with the bugs Google has suffered from over the last several years, including buying them for the traffic, both human and robotic. Until they eliminate those reasons, or find a way of divining the intent, downgrading the links is not the answer.

The links I purchased are one set of site-wide links, as opposed to many from different pages and sites. The anchor text is suffering more than it would if the links were simply disallowed; it seems to be penalized. However, I renewed it with the same anchor text recently, until I can check this out further, to see if it will bounce back to a level similar to what I would expect without the paid links.
Did you read the last paragraph? "What if a site wants to buy links purely for visitor click traffic, to build buzz, or to support another site? In that situation, I would use the rel="nofollow" attribute. The nofollow tag allows a site to add a link that abstains from being an editorial vote. Using nofollow is a safe way to buy links, because it's a machine-readable way to specify that a link doesn't have to be counted as a vote by a search engine."
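For anyone who hasn't used the attribute, this is what a paid link marked that way looks like in the page source (a minimal sketch; the URL and anchor text here are made up for illustration):

```html
<!-- A paid/advertising link flagged so search engines need not count it as an editorial vote -->
<a href="http://www.example.com/" rel="nofollow">Example Widgets</a>
```

The attribute goes on each individual link, so a site can sell some links with it and still cast normal "votes" with its other, unpaid links.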
Did you read my last post? rel="nofollow" means nothing to humans. To robots, it means do not follow the link, which means that the robot will not visit the destination. If Google is dumping the wrong sites for duplicate content, which I feel had happened to me when the proxy trash was really heavy, and I know happened during the 302 redirect problems I had, the only solution is to ensure that Googlebot hits my site more often, with more recent cache dates, etc. It is my philosophy of late to treat the search engine robots the same as my human visitors. If you will think about something as simple as hidden text, you will see that it does the human visitor no good, therefore you should not let a robot run into the same thing. A rel="nofollow" means nothing to a human visitor, so why feed that to a robot?
Not sure what to tell you about your redirect problems, but I would hope you don't rely on paid links to get Googlebot to visit your site more often. As far as showing a bot something that means nothing to a human visitor (rel=nofollow): there are things that they "ask" for and things that they don't want. You can't really compare legit attributes and black hat techniques.
First of all - Sayles - Regarding the link you posted - you think about PR too much. Second, regarding nofollow, I find that G follows them anyway, even put a couple in their index with no cache or description.
I was wondering that myself...if the bot follows the link but it is simply ignored from a PR perspective.
Really just an insurance policy - I was out of it by the time I bought the links. But it helped in the decision; there were a few cases of high traffic, high PR sites getting hijacked. This is good reading: http://forums.digitalpoint.com/showthread.php?t=16232

But it was really about having 10 copies of my pages from proxies crawled and included by Google, without much other content... and watching my rankings plummet because of them. When I had a particularly bad one shut down after contacting the host, they began to climb back. Adding the links took care of the rest, and they recovered.

rel="nofollow" is a decision by the webmaster of the site the link is on. If Google does not want to score a link, or follow a link, or even include a site, they do not have to. And rel="nofollow" has started popping up without a webmaster letting people know about it - paid links, exchanges, etc., including links that should be considered an 'editorial vote', such as in forums.

I guess what I am really saying is that if Google wants to sort paid links versus spontaneous links versus exchanged links, they should smarten up the robots and the algos, not go to the webmasters for help and penalize them if they don't.