anybody using co-op seen this... United States Patent Application 20050071741. some possibly relevant clips...

59. The method of claim 58, wherein the adjusting the ranking includes penalizing the ranking if the longevity indicates a short life for the linkage data and boosting the ranking if the longevity indicates a long life for the linkage data.

61. The method of claim 54, further comprising: determining an indication of link churn for a linking document providing the linkage data; and based on the link churn, adjusting the ranking of the linked document.

62. The method of claim 61, wherein the indication of link churn is computed as a function of an extent to which one or more links provided by the linking document change over time.

63. The method of claim 62, wherein adjusting the ranking includes penalizing the ranking if the link churn is above a threshold.

doh. my links churn
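For anyone wondering what claims 62-63 actually amount to: link churn as a function of how much a linking page's links change over time, with a penalty above some threshold. A minimal sketch of that idea in Python (the function names, the set-difference measure, and the 0.5/0.8 numbers are my own guesses, not anything stated in the patent):

def link_churn(old_links, new_links):
    """Fraction of a linking page's outbound links that changed between two crawls.
    old_links / new_links are sets of URLs seen on the page at each crawl."""
    if not old_links and not new_links:
        return 0.0
    changed = old_links.symmetric_difference(new_links)
    return len(changed) / len(old_links | new_links)

def adjust_ranking(score, churn, threshold=0.5, penalty=0.8):
    """Penalize the linked document's score when the linking page's churn is high."""
    return score * penalty if churn > threshold else score

# Example: a co-op style page that swaps half its links between crawls
old = {"a.com", "b.com", "c.com", "d.com"}
new = {"a.com", "b.com", "e.com", "f.com"}
print(adjust_ranking(1.0, link_churn(old, new)))  # churn = 4/6, above threshold, so penalized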
hm, that's not good, all co-op ranks are short-lived. btw, where did you find this? Time to start reading...
Google can invent anything they want, but that will not stop the barter of ads from this network or the many others on the internet. This could be worse news for those who buy and sell links than for members of any of the co-ops.
yeah, but if they start dropping the rankings for sites that have a link from another site one day and it's gone the next, then that would make the network fairly pointless... although it would also open an avenue for dropping the rankings of any site you didn't like
Shawn could easily change the network to content-matched links, where members could pick and choose permanent links, if that is the case.
From a patent filed 18 months (ish) ago... http://appft1.uspto.gov/netacgi/nph...&s1=20050071741&OS=20050071741&RS=20050071741 Truthfully, it's nothing new... just stuff that everyone knows (or at least should have known).
I have no idea what you are doing SEO-wise. And even if I did, no one can guarantee what you are doing won't affect your rankings (good or bad).
The link Shawn gave was the same one the thread starter gave, but the question for Shawn is about the difference between the filing date and the other date (today's date) on the document: does this mean that the patent was awarded or approved today?
The document validates beliefs that many have long held, but it's nice to see it in black and white (or the color scheme of your choice) from Google itself. While there is much there that is interesting--almost every line is, really--I think that the chief "action items" (by which I mean things we can do or refrain from doing, things more or less under our control) that come out of it for SEO purposes are few and simply stated. Whether there is a hierarchy of importance for these is hard to say from the document, but use common sense.

1. Change content: the more, and the more often, the better.

2. Add content pages: the more, and the more often, the better. When measuring "content", some things are more important than others.

Unimportant Content:
----------------------------
Javascript
comments
advertisements
navigational elements
boilerplate material
date/time tags

Important Content:
-------------------------
title
anchor text in forward links

3. Get new links, as many and as often as possible.

4. Keep existing links.

5. Try to keep the text in your inbound anchors unchanging over time.

6. Get AdSense, and work hard on improving your CTR.

7. Get your domains paid up for as long a stretch as you can afford--at least over a year; renew them early.

8. Get on a good host (one that hosts many reputable domains and few or no disreputable ones).

9. Get bookmarked as much as possible.

Cautions of significance:
===================
A large spike in backlinks is a spam flag (see the sketch after this post).
Appearing as a hit for a "discordant" set of queries is a spam flag.

(If you feel I've left anything important out, please add it.)

The concerns over link churning are, I think, legitimate, but it looks as if the "penalty" is only intended to neutralize false linking, not devastate the site (but with those looney tunes, who can be sure?). In other words, you might gain, and you probably won't lose.

----------------------------------

Taken all for all, this thing profoundly validates my judgement that the barnful of PhDs at Google are all refugees from a lunatic asylum. This is absolutely, positively not how to go about giving good search results. I'm too dispirited to go into a blow-by-blow description, but this document simply expounds in excruciating detail Google's view that, so far as SERPs go, the rich should get richer and the poor should get poorer--which is why one finds many, many highly relevant pages for a query out at #50, or #150, or #250 in their listings. Trying to really find all the good information on some topic--something I have been doing a lot lately, as I update some of my badly out-of-date sites--is a tedious nightmare precisely because one has to go so deep into the listings to be fairly sure one has all the valuable pages for the query. As I have said before, and doubtless will again, a good search is not one that returns top results that are mostly relevant pages: it is a search that returns most of the relevant pages in the top results--a very different thing, and one that no search engine yet does.
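On the "large spike in backlinks" caution: the document talks about comparing recent link growth against a document's history. A toy Python illustration of how such a flag might be computed (the window size, the 3x multiplier, and the function name are assumptions of mine, not from the patent):

def backlink_spike(weekly_new_links, recent_weeks=4, multiplier=3.0):
    """Flag a document whose recent rate of new backlinks far exceeds its historical rate.
    weekly_new_links: list of new-backlink counts per week, oldest first."""
    if len(weekly_new_links) <= recent_weeks:
        return False  # not enough history to judge
    history = weekly_new_links[:-recent_weeks]
    recent = weekly_new_links[-recent_weeks:]
    historical_rate = sum(history) / len(history)
    recent_rate = sum(recent) / len(recent)
    return recent_rate > multiplier * max(historical_rate, 1.0)

# A site that suddenly joins a link network:
print(backlink_spike([2, 3, 2, 4, 3, 150, 180, 160, 170]))  # True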
http://www.wolf-howl.com/2005/04/google-patent-analysis.html

These two are of importance here...

- Links are given a discovery date and monitored for appearance and disappearance over time (sections 22, 26, 58)
- It is determined whether a document has a trend of appearing or disappearing links (section 25)

- justin
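A per-link record for the first point might look something like this (Python; the field names and the dataclass itself are mine, the patent only describes recording when a link is first found and whether it later disappears):

from dataclasses import dataclass
from datetime import date

@dataclass
class LinkRecord:
    """History kept for one link pointing at a document."""
    source_url: str
    first_seen: date           # discovery date of the link
    last_seen: date            # most recent crawl where the link was still present
    disappeared: bool = False  # set when a crawl no longer finds the link

    def age_in_days(self, today: date) -> int:
        return (today - self.first_seen).days

# A link discovered in January and gone by March would look like:
rec = LinkRecord("http://example.com/page", date(2005, 1, 10), date(2005, 3, 1), True)
print(rec.age_in_days(date(2005, 4, 1)))  # 81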
wow, jnm, have you read this??:

"Documents and Pages

* Documents are compared for changes in the following:
  o frequency (time frame)
  o amount of change
  o (sections 6, 7, 8, 9, 11, 12)
* Number of new documents (internal?) linked to a document is recorded (sections 9, 13)
* Change in the weighting of key terms for the document is recorded (sections 10, 14)
* Documents are given a staleness (lack of change?) rating (section 19)
* The rate at which the content of a document changes and its anchor text changes are recorded (sections 31, 33)
* Outbound links to low trust or affiliate websites may be an indicator of low quality (section 0089)
* Don't change the focus of many documents at once (section 0128).
(...)"

So if one wants to change content or copy, rather than starting from scratch or making major changes, it might be better to print some of it with JavaScript and leave the old site little changed.

And look at that:

"Nameserver and Whois data is monitored for changes and valid physical addresses (same technology used in Google Maps)"

Whoa!!! and next:

"Burst link growth may be a strong indicator of search engine spam (section 0077)"

There were 3 sites that had 150,000-30,000 links and were for "hosting" in another country, hahaha. Cool
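Regarding the "frequency/amount of change" and "staleness rating" bullets in that summary, here is a toy Python sketch of one way a staleness score could combine time-since-change with how much changed. The exponential decay, the 90-day half-life, and the weighting are all my own assumptions; the summary (and the patent) only say that such signals are recorded:

import math
from datetime import date

def staleness_score(last_major_change: date, change_fraction: float, today: date,
                    half_life_days: float = 90.0) -> float:
    """0.0 = fresh, 1.0 = completely stale.
    change_fraction: rough share of the page (0..1) that changed at the last update."""
    days_since = (today - last_major_change).days
    decay = math.exp(-math.log(2) * days_since / half_life_days)  # halves every half_life_days
    freshness = decay * min(max(change_fraction, 0.0), 1.0)
    return 1.0 - freshness

# A page 30 days after a rewrite of half its content:
print(round(staleness_score(date(2005, 3, 1), 0.5, date(2005, 3, 31)), 2))  # about 0.6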
Oh, and every patent they apply for is the one they use, right? You guys are easy. This could be/is bait.
* Outbound links to low trust or affiliate websites may be an indicator of low quality (section 0089)

Can you cross-check that reference? I found no such statement, and 0089 says, in full:

[0089] In one implementation, search engine 125 may compare the average traffic for a document over the last j days (e.g., where j=30) to the average traffic during the month where the document received the most traffic, optionally adjusted for seasonal changes, or during the last k days (e.g., where k=365). Optionally, search engine 125 may identify repeating traffic patterns or perhaps a change in traffic patterns over time. It may be discovered that there are periods when a document is more or less popular (i.e., has more or less traffic), such as during the summer months, on weekends, or during some other seasonal time period. By identifying repeating traffic patterns or changes in traffic patterns, search engine 125 may appropriately adjust its scoring of the document during and outside of these periods.

That seems irrelevant to the assertion you made. In fact, I can find few references at all to outbound ("forward") links: at 0051, it says that the anchor text of forward links is considered important when assessing the scope of changes made on a page; and at 0062, it mentions changes in the number of forward links as one of a great many things that can be used to assess freshness/staleness. I may be missing something where they use a different term, but the document deeply reflects Google's concern with backlinks and outside factors affecting the ranking, as opposed to on-page content affecting much beyond estimates of freshness. (Indeed, I see no validation whatever, even implicit, of the often-touted belief that lots of outbound links somehow help a page.)
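For what [0089] actually describes, here is a minimal Python sketch of the traffic comparison: average traffic over the last j days against the document's best 30-day window within the last k days. The ratio itself and the way I pick the "best month" are my own reading; the paragraph only says the averages are compared:

def traffic_ratio(daily_traffic, j=30, k=365):
    """Compare recent average traffic to the best historical 30-day window,
    roughly per the comparison described in paragraph [0089].
    daily_traffic: list of daily visit counts, oldest first (needs at least 60 days)."""
    recent = daily_traffic[-j:]
    recent_avg = sum(recent) / len(recent)

    history = daily_traffic[-k:]
    window_avgs = [sum(history[i:i + 30]) / 30 for i in range(len(history) - 29)]
    best_month_avg = max(window_avgs)
    return recent_avg / best_month_avg if best_month_avg else 0.0

# A page whose traffic fell from ~200/day to ~100/day:
traffic = [200] * 90 + [100] * 30
print(round(traffic_ratio(traffic), 2))  # 0.5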
One thing you absolutely must remember is that any of these factors will also be weighted. For example, I seriously doubt the "staleness/new article" factor is heavily weighted at all - there are so many really old sites out there that, far from being penalised for staleness, seem to be given the benefit of being an "established authority" site.
Excellent point. Look at this Google search. Note the #3 result -- a newspaper story that is more than 5 years old but which has been in the top 5-10 for that search term for at least 3-4 years. Staleness?
Yes, right now Google loves old sites - sites that have been indexed for a long time without being penalized, that is. But they must be aware of the fact that those sites are not always satisfying, result-wise, to web searchers. I am sure they have taken that into consideration for their new algorithm.