Well, it's been a while since I last came past here, having been somewhat busy with one of my sites that has slowly started to take off, but I needed to say something.

1. Rant

The amount of time I waste on manual link building (for only slight effect, as I can't afford to pay an SEO team!) instead of improving my site, and more recently on adding a few videos (on the off chance that this is one of Google's new ranking variables) even though my site's subject isn't particularly well suited to video, but my competitors are doing it, has left me really annoyed with Google. They seem interested in maintaining the status quo for the profitable search terms - that is, ensuring the companies that can afford to pay them for SEM stay at the top - whereas the startups that may (not often, but sometimes) offer a better service for that specific area are nowhere to be found, because they can't afford the writers, SEO services, adverts etc. that the large companies can. Has Google forgotten where they came from? Do they no longer care as long as the cash keeps coming in? What happened to "do no evil"? Google is the new evil: a monopoly. Frankly, these days Yahoo's search seems more intelligent - just slower.

This has led me to start thinking about what can be done to help the genuinely good startups and teach Google a lesson into the bargain. Google is still primarily interested in one key factor: backlinks (content counts fairly highly, but not as much). That inevitably maintains the status quo, because the high-ranking sites, if moderately useful, are going to generate links naturally. Their higher SERPs mean traffic (so users post in forums etc. about them, because they are the first sites they find, or see on a TV ad), and other webmasters link to these trusted/authority sites. The rel="nofollow" attribute has only further entrenched the status quo, as webmasters link to trusted sites only and nofollow all others, to save themselves from the risk of accidentally linking to questionable sites and the penalties they would get as a result.

Add to this that many sectors won't naturally generate many backlinks (their users aren't tech-savvy), and that only sites with cash to spare can afford SEO (translation: backlinks), and you see many a good site lying in a Google wasteland on page 60 for its main keyword, behind spam or authority sites if it's lucky. On top of all this, authority/high-PR sites can rank highly just for mentioning a keyword once on a page, even if they offer nothing useful for that keyword, and the situation gets all the more depressing. Not much can be done about a site's content - and why on earth should Google rank content volume so highly? Computers aren't libraries (though they can be). Surely the site that delivers what you want should win, not the one with a few thousand pages of garbage. Consider a search for flights: all the #1 sites have pages upon pages of useless content just to help their rank (along with lots of artificial backlinks that they never get penalized for), when all that is needed is one page with a powerful, well-coded flight search & listing control. That site should rank #1, but it never will in Google's world, because users will enter the site, find what they want, and quickly leave.

2. Link Network Suggestion

Ah well, now that my rant is over, what can be done? Until Google gets their algorithm working properly, good sites need a cheap, quick way of getting decent backlinks. Fair enough, Google goes after paid links, but they also go after link networks, and this is, to my mind, where things start getting a bit blurred.
Frankly, I feel that most of the link networks up till now have been poor quality and flawed: little control over who you link to, and easily detectable by Google. I've been thinking about this a lot over the last few months and have come up with the following.

1. What is required: a way for sites to acquire backlinks at an exponential rate, from numerous unique sources, ideally with the majority from on-category and country-related sites, and with participation in the network remaining undetectable.

a. Sites are presented with related sites to link to (they pick which) and in return receive links back, mainly from related sites.
b. Sites must remain fully in control of who they link to.
c. The network must be very large, to compete with the authority sites.
d. Three-way linking is encouraged.
e. The network must be completely undetectable.

To implement this I have come up with the following:

a. A user registers on the network and is given an id and logon.
b. The user then selects the sites they want to link to - by category, site quality, etc. - from those sites with link credits (see d.) and adds the links to their sites.
c. Their outgoing links are verified (through direct checks, checks faked using Googlebot's user agent, and checking Google's and others' caches).
d. In return the user receives x link credits, based on the quality of their outgoing links (relevance, number, pages, PR, etc.).
e. They place requests for links to their own site.
f. Other users link to them, and their link credits are deducted.
g. As a site's PR grows or falls, its link credit is recalculated and adjusted regularly, and users are notified to adjust their links if required - or it is done automatically?

To automate this I would suggest the following:

a. A simple website with an admin screen and an easy way of reporting "bad sites".
b. E-mail notification of link changes and reported "bad sites".
c. Links are not served up in the form of a conventional plugin. Instead there is a web service, which is used to download an XML file containing the user's links (downloaded daily to reduce bandwidth).
d. A user community provides various implementations for incorporating the links into a site automatically (though most coders will implement their own methods to ensure they aren't detected), e.g. in-line, footer links, etc. The options are endless and undetectable, as the links are simply an XML list, which, although harder for a site to incorporate, has the significant advantage of being undetectable.
e. AND MOST IMPORTANTLY, 50% of the links in the network are dummies, i.e. sites not participating in the network, just added anyhow. While this halves the value of the links that sites in the network receive, it also ensures that Google cannot penalize sites in the network: even if they join it, how can they work out who is really in it? (Most links are three-way or even more, or may not be present on the targeted site at all.) That way the network is secure, and users can counteract the PR bleed by grouping the outgoing links onto one set of pages whilst the incoming links point at another set, so their high-PR pages are the ones they want users to find.

Further thoughts: should there be forced links, so that all sites (users can report bad sites) gain at least some links, not just the best?

Anyhow, I just wondered what the interest in such a solution would be. I'm not sure I will have the time to code it, and even if I did it would probably take me six months to a year for the basic implementation, in between my day job and working on my sites. But I might well have a shot at it if the demand is sufficient. I guess it could be paid for either through ads on the support forum or a few dollars' admin fee for each site in the network.
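To make the XML-feed idea above a bit more concrete, here's a minimal sketch in Python (standard library only). The element names and attributes are made up for illustration, since I haven't specified a schema; in practice the file would be fetched from the network's web service once a day (e.g. with urllib.request, or a cron job) and cached locally, and each site would wrap the resulting anchors in its own markup (inline, footer, etc.) so the output carries no common fingerprint.

```python
# Sketch of the "links as an XML list" idea above.
# The feed schema here is invented purely for illustration.
import xml.etree.ElementTree as ET

# In practice this would be downloaded once a day from the
# network's web service and cached to keep bandwidth down.
SAMPLE_FEED = """\
<links>
  <link url="http://example-a.com/" anchor="Example A" />
  <link url="http://example-b.com/widgets" anchor="Cheap Widgets" />
</links>
"""

def render_links(feed_xml: str) -> str:
    """Turn the XML link list into plain HTML anchors.

    Each site wraps these differently (inline, footer, etc.),
    so there is no shared plugin markup to detect.
    """
    root = ET.fromstring(feed_xml)
    anchors = [
        '<a href="%s">%s</a>' % (link.get("url"), link.get("anchor"))
        for link in root.findall("link")
    ]
    return "\n".join(anchors)

print(render_links(SAMPLE_FEED))
```

The point of keeping the service as a bare XML list rather than a plugin is exactly what makes it harder to fingerprint: every site ends up with different surrounding markup.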
Please feel free to spread this idea, suggest alternatives, or modify it. The point is to get the word out, as I'm fed up with Google's current approach, which they can only get away with because they are a monopoly - worse than MS, in my mind. And if somebody else takes this idea and runs with it, then great! Where can I sign up?

Message to Google, if you ever read this: you will never have an amazing search engine whilst you rely on outdated techniques to rank sites. In doing so you are destroying the natural linking and evolution of the web, and enforcing the status quo rather than supporting sites which are better (maybe not revolutionary, just better) than a lot of what's out there at the moment. That's not always the case, and sometimes the status quo maybe should be maintained, but you don't know when! However hard you try to analyze user behaviour, you can never get a true picture: if users can't find the sites they want for the relevant keyword, you can never observe their behaviour in sufficient quantities to make a decision - and if they do find them, when should they stay on a site or be led on to others? You will not manage to automate search rankings any time soon (human ingenuity and cash resources vs. automated software and occasional checks), and you should consider paid manual reviews instead: a site pays a reasonable annual fee (based on country) for a review, with an initial review and then monthly (or as appropriate) reviews afterwards, and is ranked according to that review. Any site that really cares will happily pay the fee (consider Yahoo's directory, for instance), and you could offer it free to charities/NGOs etc. You'll make plenty, and users will have the best search ever.
Anyhow, I'm off now to work on my new program to examine the backlinks of my competitors and find submission forms or e-mail addresses on their linking sites, instead of spending time geocoding more locations for my website to improve its search, or improving its performance. Thanks, Google!!!! :-( Your website guidelines are a joke in the real world. Why? Because a site delivering a useful service is nowhere to be found for its main keyword, whilst garbage sites (nothing but affiliate links and directories of addresses, though with lots of artificial links) are #2, 3 and 4, and on top of that the smallest, least popular company in the trade ranks #1 thanks to a large number of artificially acquired links (they spent a lot of money on SEO and content!). Also, I must say thanks to serplink and the coop network here for the basis of these ideas. I'll try to stop past this thread to see if it's going anywhere, but I'm not sure I'll have time to post much more of use - I've already spent too long on this post.
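For what it's worth, the competitor-backlink program mentioned above could start from something as simple as this toy Python sketch (standard library only): given the HTML of a page that links to a competitor, it collects mailto: addresses, plain-text e-mail addresses, and submission-form targets. The page content and names here are invented; this isn't the actual program, whose details I haven't given.

```python
# Toy version of the backlink-prospecting idea: scan a linking
# page's HTML for contact e-mails and submission forms.
import re
from html.parser import HTMLParser

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

class ContactFinder(HTMLParser):
    """Collects mailto: targets, text e-mails, and form actions."""

    def __init__(self):
        super().__init__()
        self.emails = set()
        self.form_targets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href", "").startswith("mailto:"):
            self.emails.add(attrs["href"][len("mailto:"):])
        elif tag == "form":
            self.form_targets.append(attrs.get("action", ""))

    def handle_data(self, data):
        # Also catch addresses written in plain text.
        self.emails.update(EMAIL_RE.findall(data))

# Invented sample page standing in for a real linking site.
page = """<html><body>
Contact us at webmaster@example.org or use the form.
<a href="mailto:links@example.org">submit a link</a>
<form action="/submit-site" method="post"><input name="url"></form>
</body></html>"""

finder = ContactFinder()
finder.feed(page)
print(sorted(finder.emails), finder.form_targets)
```

The real work, of course, is pulling the competitor's backlink list in the first place and fetching each linking page before running something like this over it.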