
Solving the Google Sandbox

Discussion in 'Search Engine Optimization' started by I. Brian, Dec 2, 2004.

  1. gchaney

    gchaney Peon

    Messages:
    144
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #41
    G does not show all the links, and I believe that is again intentional. Google does not want to make it easy for SEOs to identify links for SEO purposes. Remember, links are supposed to be natural VOTEs - a natural vote does not require you to know where all the votes are coming from....lol

    BTW, I didn't say they removed the links, I think they take away the value.

    As this is a theory based upon the intent of PR vs. what we are seeing, then yes, I believe G recognizes there is "value" in site "A" linking to site "B". My rambling was long enough already... however, I do believe there is a "factor" weighing the percentage of one-way votes vs. recip links vs. anchor text, with a consequence attached.

    The adjustment most definitely must be complicated. This was simply a general analysis of the root cause and effect, as many new sites start out doing mass link exchanges and forget about one-way linking.
     
    gchaney, Dec 2, 2004 IP
  2. I. Brian

    I. Brian Business consultant

    Messages:
    810
    Likes Received:
    59
    Best Answers:
    1
    Trophy Points:
    145
    #42
    Crikey – this thread grew fast. :)

    The point about the low PR links was *never* that low PR links can harm a site – merely that they will not be counted until after a forced delay period. The expectation would be that they would normally be counted and their "vote" applied without any such delay.

    While I very much agree with mcdar that low PR links can be expected to take longer to index, the problem is that even after these links are indexed, they are not impacting the SERPs when they normally should. My understanding is that this is entirely where and why the Google Sandbox has become such a discussed issue.

    I have to confess to not having followed the previous discussion on off-topic linking – but my commercial experience is that off-topic links have value and, in number, can be extremely effective. Obviously, though, some degree of on-topic linking is preferred in the mix.

    Also, the point about adding links fast being "devastating" for rankings might well be explained by overly similar anchor text, which is a going concern in SEO. Part of the general SEO procedure often involves "naturalising" the links, which also allows a wider scope of search terms to be impacted.

    The point there is that the co-op network includes a number of higher PR pages. Certainly there are a lot of low PR pages involved – we’re often talking about sitewide linkage, so of course they will be in number. However, the co-op network also involves a lot of higher PR pages – index pages, main navigation pages.

    According to the suggestion in the report, it is precisely these “higher quality” “recommendations” that Google is taking positively into consideration, and my contention is that this widespread “recommendation” by so many higher PR pages across a number of sites is precisely why the Co-op network is allowing links to impact the SERPs with almost immediate effect.

    After all, the big question is why the Co-op network avoids sandboxing – I’m simply putting forward the PageRank aspect as a contending possibility.

    Certainly John Loch's comments ring true, too – the Co-op network is no substitute for commercial SEO in highly competitive areas. But, again, I have set up a few thousand low PR links, seen them indexed three months ago, and still not seen the anchor text impact even a non-competitive term. Yet the moment I added that page to the Co-op network, my keyphrase was top of the SERPs within a week. That's the big incongruity I'm trying to address.

    Part of the problem is that toolbar PR is not being accurately reported – there is definitely a good lag on the actual value of PR being given, which was mentioned in the report.

    Also remember that dynamic forum pages traditionally never show PageRank on the toolbar – that has always been an issue. Even when backlink listings were only supposed to include PR4+ pages, dynamic forum threads with .php extensions have always shown as zero. It was only their appearance in backlink updates that suggested they had any PR at all – but even then, if PR updates constantly, as is suspected, those threads will certainly be showing an out-of-date PageRank anyway.

    As for Google’s attitude towards reciprocal links – certainly I’ve seen *a lot* of debate centered on whether Google is even counting such links (the old “links.htm” controversy comes to mind). However, the majority of what I’m referring to is one way linkage. I actually mentioned recips as something of an addendum, but I hope that discussion point does not distract too much from the main contentions I’ve raised.
     
    I. Brian, Dec 2, 2004 IP
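
    (Aside: a toy sketch of the hypothesis above - low PR links get indexed, but their "vote" is withheld for a delay period, while higher PR votes apply at once. The PR threshold, the 90-day gate and every number here are invented for illustration; this is speculation, not Google's known algorithm.)

    ```python
    from datetime import date, timedelta

    # Hypothetical parameters - pure guesses for illustration.
    PR_THRESHOLD = 4            # links from pages below this PR get delayed
    DELAY = timedelta(days=90)  # the supposed delay before low PR votes count
    DAMPING = 0.85              # standard PageRank damping factor

    def counted_link_weight(source_pr: float, source_outlinks: int,
                            first_indexed: date, today: date) -> float:
        """PageRank 'vote' a link passes, under the delayed-vote hypothesis."""
        vote = DAMPING * source_pr / max(source_outlinks, 1)
        if source_pr < PR_THRESHOLD and today - first_indexed < DELAY:
            return 0.0  # indexed, but the vote is withheld for now
        return vote

    # A PR2 link indexed a month ago passes nothing yet; a PR6 link counts at once.
    print(counted_link_weight(2, 50, date(2004, 11, 1), date(2004, 12, 2)))  # 0.0
    print(counted_link_weight(6, 50, date(2004, 11, 1), date(2004, 12, 2)))  # 0.102
    ```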
  3. longcall911

    longcall911 Peon

    Messages:
    1,672
    Likes Received:
    87
    Best Answers:
    0
    Trophy Points:
    0
    #43
    Hi Brian,

    Taking your thought a step farther, isn’t it true that high PR sites are crawled more frequently and more deeply than low PR sites? If so, it stands to reason that their pages would also be analyzed (links, keywords, titles, etc) more quickly than low PR sites.

    That might explain the coop’s faster impact on SERPs.

    /*tom*/
     
    longcall911, Dec 2, 2004 IP
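
    (Aside: a minimal sketch of the crawl-frequency point, with made-up numbers. If the recrawl interval shrinks as PR grows, a high PR page's new links get seen and analyzed months sooner than a PR0 page's.)

    ```python
    # Hypothetical crawl scheduling: recrawl interval halves per PR point.
    # The 64-day base and the halving rule are invented for illustration.
    def recrawl_interval_days(pagerank: int, base_days: float = 64.0) -> float:
        return base_days / (2 ** pagerank)

    for pr in (0, 2, 4, 6):
        print(f"PR{pr}: recrawled roughly every {recrawl_interval_days(pr):.1f} days")
    # PR0: ~64 days; PR6: ~1 day - two months' head start for the high PR page.
    ```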
  4. john_loch

    john_loch Rodent Slayer

    Messages:
    1,294
    Likes Received:
    66
    Best Answers:
    0
    Trophy Points:
    138
    #44
    Actually longcall, that's where things get interesting. It's very true that high PR pages cop a crawl daily if not hourly, but I just saw a link newly introduced to the network appear across 30+ sites (well, 30 that caught my attention out of 5000+), all of which have PR0. It was introduced, I think, 3 days ago.

    The interesting aspect of this, for me, is not so much the PR. It's given rise to an alternate possibility. When someone lands at a site, sooner or later the internal linkage lands them at a hub, usually the home page. What if a link from the internal hub (which will obviously, by way of link structure, also have the greater PR) actually yields greater benefit to the recipient?

    By default, you could have PR0 but, of course, be breaking the latest research news or whatever. Gfodder big-time. Is it not possible that this internal hub is primary, with the PR a natural second?

    Above and beyond everything else though, Occam's razor wins out. At the end of the day, if the process of elimination leaves you with only one solution, no matter how unlikely.. ;)

    I still believe theme is very significant, because just as you deal with local hubs, thematic hubs (hub/auth sites) also make the most sense. I suppose for me it's always a matter of remembering that the technology is designed to assess human information structures, and not VV (although it certainly keeps me busy) :).

    Cheers

    JL
     
    john_loch, Dec 2, 2004 IP
  5. longcall911

    longcall911 Peon

    Messages:
    1,672
    Likes Received:
    87
    Best Answers:
    0
    Trophy Points:
    0
    #45
    Yes it is very possible.

    I think I’m following your hub example, but am at a bit of a disadvantage because I don’t know the coop at all. My basic understanding will probably suffice though. My question would be, what physical, measurable characteristics are typical of a coop hub?

    Is it possible that they demonstrate large blocks of content that change frequently, perhaps hourly? Now if those blocks were structured as we would expect news to be, that is with a heading here or there, some content related to the heading, and maybe a scattered date or two, we have a number of attributes that are very measurable.

    We all know GG prides itself on being able to get news content indexed quickly. I think it is safe to assume that GG follows a set of rules to identify a *news* site as opposed to humans doing it on a site-by-site basis.

    So, does the coop happen to pass most of the heuristic rules therefore giving the appearance of a news site and therefore receiving the benefits? That would explain rapid indexing of PR0 pages, and maybe even the indexing of brand new pages. Perhaps news sites and portals are exempt (to some degree) from sandbox periods. That gets right back to Brian’s original sandbox point.

    It would be pretty funny if GG thinks the coop is a monster news portal. :)
     
    longcall911, Dec 2, 2004 IP
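
    (Aside: a toy version of the heuristic speculated about above - score a page on measurable "news-like" attributes such as frequently changing blocks, headings and dates. Every feature and weight here is invented; nobody outside Google knows what rules, if any, it applies.)

    ```python
    # Hypothetical news-likeness score from measurable page attributes.
    # Features and weights are pure guesswork for illustration.
    def news_likeness(changed_blocks_per_day: float, headings: int,
                      dated_items: int, total_blocks: int) -> float:
        churn = min(changed_blocks_per_day / max(total_blocks, 1), 1.0)
        structure = min((headings + dated_items) / max(total_blocks, 1), 1.0)
        return 0.6 * churn + 0.4 * structure  # 0..1; high = "looks like news"

    # A page rotating many dated, headed content blocks daily scores high:
    print(news_likeness(changed_blocks_per_day=20, headings=15,
                        dated_items=10, total_blocks=25))  # ~0.88
    ```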
  6. digitalpoint

    digitalpoint Overlord of no one Staff

    Messages:
    38,333
    Likes Received:
    2,613
    Best Answers:
    462
    Trophy Points:
    710
    Digital Goods:
    29
    #46
    Personally, I don't think there is any sort of sandbox for pages. Just sites/domains as a whole. That's why topics in this forum rank high the day after they are posted, and new random junk on my personal blog (and everyone else's I would assume) always ranks high.
     
    digitalpoint, Dec 2, 2004 IP
  7. chachi

    chachi The other Jason

    Messages:
    1,600
    Likes Received:
    57
    Best Answers:
    0
    Trophy Points:
    0
    #47
    Well, I think that most of what Brian had to say was definitely food for thought. I have always felt that there is some kind of sandboxing or incubation period for the links pointing to a domain or page. After reading through this thread (and sniffing the fumes from the paint outside), I am thinking that maybe the filter G is using (if they are using one) only allows a site to receive a certain number of links (votes) based on its time in the index. We all know that new sites will not rank for semi-competitive to competitive terms right away... but we don't know why. Perhaps your new site is only allowed the voting power from, say, 100 links per month in the G index (for example)? Perhaps the number of pages with x amount of content on them bumps that number up... as it seems large (5k+ pages) new sites can build links (as shown by the link: operator) more quickly than, say, a 20 page site?

    If something like that were true it may explain why DP.com is able to rank so quickly for new terms. And, why DP.com is able to really benefit from the coop links (as it presently shows 90k+ pages (non-api)) and a gazillion links.

    Incubating the links and sites that way would make sense to me if I were G. I would want to encourage new sites to be included in the index, but not let them be the BMOC for a while. I would want to feel out the sites first and see how they behave.

    Hopefully this makes some kind of sense...I need some fresh air. :)
     
    chachi, Dec 2, 2004 IP
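
    (Aside: chachi's "voting allowance" guess, sketched with his own example numbers. Only the first N new links per month get counted, with the allowance perhaps growing with indexed page count. The 100-per-month base and the page bonus are illustrations, not known values.)

    ```python
    # Hypothetical link incubation: cap the votes a new site may receive
    # per month in the index. All numbers follow the example in the post.
    def counted_links(new_links_this_month: int, months_in_index: int,
                      indexed_pages: int) -> int:
        allowance = 100 * months_in_index        # 100 counted links per month
        allowance += indexed_pages // 1000 * 50  # guessed bonus for large sites
        return min(new_links_this_month, allowance)

    print(counted_links(5000, months_in_index=1, indexed_pages=20))     # 100
    print(counted_links(5000, months_in_index=1, indexed_pages=90000))  # 4600
    ```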
  8. digitalpoint

    digitalpoint Overlord of no one Staff

    Messages:
    38,333
    Likes Received:
    2,613
    Best Answers:
    462
    Trophy Points:
    710
    Digital Goods:
    29
    #48
    To be honest, I don't really think the coop ad network bypasses the sandbox. There are many sites in the coop ad network that are #1 for allinanchor but nowhere in the normal query results. For the ones I happen to know about, the only thing they share is that they are all domains that were registered within the last few months.
     
    digitalpoint, Dec 3, 2004 IP
  9. I. Brian

    I. Brian Business consultant

    Messages:
    810
    Likes Received:
    59
    Best Answers:
    1
    Trophy Points:
    145
    #49
    The question of hubs has certainly come up as an idea - but if we go all the way back to the original Google paper, "PageRank" and "authority" were originally supposed to be linked conceptually anyway.

    Indeed, Barry disagrees too:
    http://www.seroundtable.com/archives/001218.html

    So I've set up a quick little test for a page on a very new site, using the Co-op network: (third post down)
    http://www.platinax.co.uk/community/thread1408.html

    Certainly there is going to be lots of low PR linkage involved, which my initial idea suggests isn't going to impact SERPs anytime soon.

    However, if PageRank is a major factor, then the larger number of higher PageRank pages are enough to help move sites up to some degree in the listings. Obviously, the degree of active competition in the SERPs is likely going to be a factor.

    As before, the key question I'm trying to address, though, is why links on the Co-op network can have any kind of relatively quick impact on rankings at all - certainly when other links may not.
     
    I. Brian, Dec 3, 2004 IP
  10. digitalpoint

    digitalpoint Overlord of no one Staff

    Messages:
    38,333
    Likes Received:
    2,613
    Best Answers:
    462
    Trophy Points:
    710
    Digital Goods:
    29
    #50
    It may just be as simple as pages with PageRank being spidered more regularly, so they are seen faster. Check allinanchor for eharmony, for example: #1 and #2 (above even eharmony.com). The link: function shows a ton, but the domain itself is relatively new. Another similar example is hummer dealer (link: doesn't show much on that one because it's very new to the coop). But I expect they will be top 5 on their terms at the very least whenever the sandbox is done.

    They also have some good solid static PR6 links as well (not from the coop).
     
    digitalpoint, Dec 3, 2004 IP
  11. Elee

    Elee Well-Known Member

    Messages:
    407
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    108
    #51
    Can anyone tell me how long a site is typically "sandboxed"??
     
    Elee, Dec 3, 2004 IP
  12. I. Brian

    I. Brian Business consultant

    Messages:
    810
    Likes Received:
    59
    Best Answers:
    1
    Trophy Points:
    145
    #52
    Actually, if Google are using an automated WHOIS query to give weight to site links based on length of time domain registered, then that might also explain the exact same thing.

    So the issue of PageRank could be entirely incidental - in other words, older more established sites will generally have respectable PR anyway.
     
    I. Brian, Dec 3, 2004 IP
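
    (Aside: a sketch of the domain-age idea - scale link weight by how long the domain has been registered. The linear ramp and the 12-month maturity point are invented for illustration; darksat's variant below would simply swap the registration date for the date Googlebot first found the site.)

    ```python
    from datetime import date

    # Hypothetical age weighting: a domain's links reach full weight only
    # after 12 months. The ramp shape is invented for illustration.
    def age_weight(registered: date, today: date, maturity_months: int = 12) -> float:
        months = (today.year - registered.year) * 12 + (today.month - registered.month)
        return min(max(months, 0) / maturity_months, 1.0)

    print(age_weight(date(2004, 9, 1), date(2004, 12, 3)))  # 0.25 - heavily damped
    print(age_weight(date(2002, 1, 1), date(2004, 12, 3)))  # 1.0  - full weight
    ```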
  13. darksat

    darksat Guest

    Messages:
    1,239
    Likes Received:
    16
    Best Answers:
    0
    Trophy Points:
    0
    #53
    They wouldn't need to - they could just take the age of the site from when Googlebot first found it. It mightn't be as accurate in the technical sense, but it would be just as effective.
     
    darksat, Dec 3, 2004 IP
  14. longcall911

    longcall911 Peon

    Messages:
    1,672
    Likes Received:
    87
    Best Answers:
    0
    Trophy Points:
    0
    #54
    I would be inclined to credit site age more so than PR, although I’m not suggesting that PR is unimportant. I think it is all about *trust*. That is, Google’s development of a level of trust, over time.

    If I were GG, I would trust no one. I’d assume that every new site was out to deceive me. So, I’d sandbox every new site until I knew more about it. If I started to see links from other sites, and saw regularly added content, and if I saw general stability, I’d let it out of the box and watch it closely.

    Over time, if all seemed normal, I would trust it more. As the site matured, and once it gained my full trust, I would take clear notice of the sites (pages) it links to.

    If another new site came along, and if the first (fully trusted) site linked to it, this would boost my trust in the new site. In other words, the new site would *inherit* my trust. I would therefore crawl this new site more often, and I would analyze its pages faster, and I would remove some of the ‘rank inhibitors’ I had placed on it, allowing it to rise to its natural position in listings.

    I think that if Google trusts a site, it can do a lot of things (like SEO) and not be penalized. I have a close friend who I trust completely. I might say to him, “I’m going away for the weekend. Here are the keys to my house”. And because I trust him, I trust his friends.

    His friends inherit my trust. This is the exact behavior I see in Google, although I’m still waiting for keys to the house. :) The point is, the co-op may be seen as a group of highly trusted sites.
     
    longcall911, Dec 3, 2004 IP
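
    (Aside: the trust-inheritance idea above closely resembles what the 2004 TrustRank paper by Gyöngyi et al. formalised - seed trust on vetted sites and let it decay as it propagates through links. A minimal sketch; the 0.85 decay and the 0.5 release threshold are invented.)

    ```python
    # Minimal trust-propagation sketch in the spirit of TrustRank:
    # a new site inherits decayed trust from the sites linking to it.
    DECAY = 0.85   # invented: trust lost per hop
    RELEASE = 0.5  # invented: trust needed to leave the sandbox early

    def inherited_trust(linker_trust_scores: list[float]) -> float:
        return max((DECAY * t for t in linker_trust_scores), default=0.0)

    new_site = inherited_trust([1.0, 0.3])  # linked from one fully trusted site
    print(new_site, "- released early" if new_site >= RELEASE else "- stays boxed")
    # 0.85 - released early: the trusted friend handed over the house keys.
    ```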
    I, Brian likes this.
  15. I. Brian

    I. Brian Business consultant

    Messages:
    810
    Likes Received:
    59
    Best Answers:
    1
    Trophy Points:
    145
    #55
    A very fair point indeed - and the time the site has been in Google's index, as opposed to a WHOIS query, is definitely an angle I overlooked.
     
    I. Brian, Dec 4, 2004 IP
  16. zez

    zez Peon

    Messages:
    193
    Likes Received:
    16
    Best Answers:
    0
    Trophy Points:
    0
    #56
    I have seen many Sandbox theories, but the idea that Google starts to 'trust' a site after a certain amount of time is, in my opinion, the most probable. I have a site which, after 1 year, has suddenly started to get very good SERPs on many competitive keywords. The site is only PR3 and does not have many links. Within this site I have a page which was recently added; it has PR0, but it is doing surprisingly well.
     
    zez, Dec 8, 2004 IP
  17. 4Comparison

    4Comparison punkah walla

    Messages:
    23
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #57
    Throughout this entire thread, age keeps coming up as an under-rated variable.
    When developing a new site (non-commercial), most links are acquired from others eager to exchange links, mostly new sites themselves, leading to a delay in the effectiveness of those links.
    Could the Co-op's substantial growth over the last 6 months or so be due to an influx of older sites, thus giving a boost of "trust" to the network?

    Additionally, on another thread, someone wrote of their disgust with the sandbox and moved their entire site to a sub-domain of an old, established site and got almost immediate high rankings.
     
    4Comparison, Dec 9, 2004 IP
  18. Catfish

    Catfish Peon

    Messages:
    117
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    0
    #58
    1) I agree with Shawn that the sandbox is URL/domain related. We run a large network of sites that are all hosted on unique class C IPs and have excellent page rank. We interlink these sites using triangular exchanges, and the resulting page rank is usually a 5 or 6 for the homepage of any new site we put up. Typically our sandbox is running about 3 months right now for most new sites.

    2) I don't believe that reciprocal links are being penalized, or most of our sites would have dropped dramatically in the SERPs. People's speculation about how things should be, or what makes sense for a search engine to do, often clouds their view of what is happening in the real world. No matter how many times Doug (from ihelpyouservices.com) and some other white hats try to tell me that reciprocal links are a waste of time, the proof is in the listings.

    Most people think page rank is dead too. But it's not. I betcha Shawn doesn't think page rank is dead (hi Shawn). I believe that page rank is why Google bombing can still take place, as in the example where someone links to Google or some other high PR site with a link like www.google.com?keyword and suddenly Google ranks highly for the keyword.

    But regardless of the page rank argument (sorry, I was drifting), reciprocal links are not dead and they still work. The trick is to know how to identify the pages that will help the most and try to acquire those links. I have seen lots of forum posts where people are submitting to every directory under the sun, thinking that will help their backlinks, when most of these directories don't pass PR because of their linking structure. If TSPR is in play, it would seem to me that the best link partners are ones listed in DMOZ in a category related to your own. Anyway, that's been my experience. Hope it helps.
     
    Catfish, Dec 10, 2004 IP
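
    (Aside: TSPR here is Topic-Sensitive PageRank, Haveliwala's 2002 variant in which the random jump lands only on a topical seed set - say, a DMOZ category - so in-topic link partners pass topic-biased rank. A sketch on a tiny made-up graph:)

    ```python
    # Topic-Sensitive PageRank sketch: teleportation is restricted to a
    # topical seed set, so partners inside it pass topic-biased rank.
    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}  # page -> outlinks (made up)
    seed = {0, 1}                                # "in our DMOZ category"
    N, d = 4, 0.85
    v = [1 / len(seed) if p in seed else 0.0 for p in range(N)]

    rank = [1 / N] * N
    for _ in range(50):  # power iteration
        new = [(1 - d) * v[p] for p in range(N)]
        for p, outs in links.items():
            for q in outs:
                new[q] += d * rank[p] / len(outs)
        rank = new
    print([round(r, 2) for r in rank])
    # ~[0.39, 0.24, 0.37, 0.0] - page 3, outside the topic seed, decays to zero
    ```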
  19. longcall911

    longcall911 Peon

    Messages:
    1,672
    Likes Received:
    87
    Best Answers:
    0
    Trophy Points:
    0
    #59
    If you choose to consider the concept of trust, then one could further clarify the issue by saying that age is in fact important, but that a site’s behavioral pattern over time, and the trustworthiness of its link partners also figure in.

    Older sites that interlink and seem to have honored Google's golden rule, "don't do anything that attempts to deceive us in any way" (the co-op may fit this profile), may receive the benefit of Google's trust... higher rankings and shorter sandbox periods for newly added sites/pages (trustFactor = 1.0 as opposed to .85, for example).
     
    longcall911, Dec 11, 2004 IP
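
    (Aside: putting tom's numbers into code - treat trustFactor as a straight multiplier on whatever raw score a site earns. The 1.0 and .85 values are his examples, nothing more.)

    ```python
    # tom's example values: trusted sites keep 100% of their earned score,
    # less-trusted ones only 85% - plus, presumably, a longer sandbox.
    def effective_score(raw_score: float, trust_factor: float) -> float:
        return raw_score * trust_factor

    print(effective_score(80.0, 1.0))   # 80.0 - trusted, established, interlinked
    print(effective_score(80.0, 0.85))  # 68.0 - same links, less trust
    ```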
  20. webvivre

    webvivre Peon

    Messages:
    249
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    0
    #60
    Does this mean......

    a) Non-relevant reciprocal links from older sites / authority sites are acceptable in Google's eyes?
    b) We should be looking for reciprocal links from older sites because they effectively carry more weight?

    Does old site = good site in Google's eyes? Don't think so.....
     
    webvivre, Dec 13, 2004 IP