Google "Sandbox" Threshold with "Hysteresis"

Discussion in 'Search Engine Optimization' started by hexed, May 21, 2004.

  1. #1
    I've been doing some thinking and research on the entire sandbox effect, and here's a preliminary theory that just popped into my head. I understand that there have been similar theories on the subject and this is not completely new. However, I _believe_ (key words being "I believe") that the theory presented below may be the most accurate of them to date. The main point here is _HYSTERESIS_. Please try to understand the concept; I know it may be tough if you've never heard the word before.

    THIS IS JUST A THEORY AND HAS NOT BEEN PROVEN - PLEASE TREAT IT AS SUCH.

    UNTESTED THEOREM:

    Google applies a dampening effect to new sites specific to the keyword or keyword combination. The amount of dampening and the length of time it takes even to be listed are proportional to the number of sites already listed in the SERPs for those specific keywords.
    i. A ranking threshold is also applied (see below).
    ii. Gluttonous gathering - accumulating links too quickly has a global negative effect on the formula/theory (see below).

    EXAMPLES:

    CASE 1: Adding a new site or page to the Google SERPs where there are already 10 pages listed for "some odd keyword combination". This may only have a dampening effect of 10% and a time to be listed (TTBL) of 3 days.

    CASE 2: Adding a new site or page to the Google SERPs where there are already 1,000,000 pages listed for "common keyword". This may have a large dampening effect of 99% and a time to be listed (TTBL) of 3 months.
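
    To make the idea concrete, here is a minimal sketch (in Python) of how such a dampening curve might be modelled. Every constant and the log scaling are my own guesses purely for illustration - none of this is anything Google has published:

    Code:
    import math

    def sandbox_effect(competing_pages):
        """Toy model of the theory above: dampening and time-to-be-listed (TTBL)
        grow with the number of pages already ranked for the keyword.
        Every constant here is invented purely for illustration."""
        # Map competition onto a 0..1 scale: ~10 competing pages -> 0, ~1,000,000 -> 1.
        saturation = (math.log10(max(competing_pages, 10)) - 1) / 5
        saturation = min(max(saturation, 0.0), 1.0)
        dampening = 0.10 + 0.89 * saturation    # 10% .. 99%
        ttbl_days = round(3 + 87 * saturation)  # ~3 days .. ~3 months
        return dampening, ttbl_days

    print(sandbox_effect(10))        # -> (0.10, 3)   roughly CASE 1
    print(sandbox_effect(1000000))   # -> (0.99, 90)  roughly CASE 2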

    RANKING THRESHOLD (WITH HYSTERESIS):
    To further complicate the problem, some sites may have to pass a "ranking threshold" to even be added into the SERPs. This explains why some sites have been in the sandbox for over a year. I'm not talking about a threshold at the 10,000th site in the SERPs for that keyword; I mean it may have to pass the threshold of the 50th site in the SERPs for that keyword. If there are any electrical engineers out there, the device "Schmitt trigger" immediately comes to mind. Also, a dynamic factor that seems to be at play in the SERPs is what engineers call "hysteresis". Even though hysteresis is an important factor and paramount to the definition, the logical operation of the Schmitt trigger explains the entire ranking-threshold-with-hysteresis idea perfectly. Please look again at the Schmitt trigger.
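
    For anyone who hasn't met a Schmitt trigger: it is a comparator with two different switching points, so the output doesn't flip back and forth while the input hovers near a single threshold. A rough sketch of how that could map onto SERP inclusion - the 0.9/1.7 switching levels are invented for illustration, not anything measured:

    Code:
    class SchmittTrigger:
        """Two-threshold comparator: switches ON above the upper level and OFF below
        the lower level. Between the two it keeps its previous state - hysteresis."""
        def __init__(self, lower=0.9, upper=1.7):
            self.lower, self.upper = lower, upper
            self.listed = False  # is the page currently shown in the SERPs?

        def update(self, ranking_score):
            if ranking_score >= self.upper:
                self.listed = True
            elif ranking_score <= self.lower:
                self.listed = False
            # otherwise: keep the previous state (the hysteresis band)
            return self.listed

    trigger = SchmittTrigger()
    for score in [1.0, 1.5, 1.8, 1.3, 1.0, 0.8]:
        print(score, trigger.update(score))
    # The page only appears once the score crosses 1.7, and it does not drop
    # out again until the score falls all the way below 0.9.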


    GLUTTONOUS GATHERING:
    I also firmly believe that gathering links too quickly may extend the TTBL and increase the dampening effect - for obvious spam-related reasons. This has already been seen with the "dream team" nigritude ultramarine site.
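
    If I had to model this, it might just be a multiplier on the dampening and TTBL driven by link velocity - something like the sketch below, where the "natural" link rate and the resulting factor are pure guesses:

    Code:
    def gluttony_factor(new_links_per_week, natural_rate=20):
        """Toy model: links gathered faster than some 'natural' rate inflate both
        the dampening effect and the TTBL. The rate and the scaling are invented."""
        return max(new_links_per_week / natural_rate, 1.0)

    dampening, ttbl_days = 0.50, 30
    factor = gluttony_factor(new_links_per_week=200)          # a link-buying spree
    print(min(dampening * factor, 0.99), ttbl_days * factor)  # heavier filter, far longer wait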

    Hexed

    Comments Please!
     
    hexed, May 21, 2004 IP
  2. laiwa

    laiwa Peon

    #2
    Yes, there is definitely some algorithm that isn't linear. I can relate it somewhat to the adaptive regulation methods used in industrial engineering. There are very well developed and advanced systems for regulating industrial processes, and these could easily be applied to this ranking process too. It is of course a separate thing from the actual "preliminary ranking" that is based on on-site and off-site content. This secondary function (one of many) might be applied to regulate the speed of movement in the SERPs and the relative ranking of the sites. Age can be a factor here: if you look at a basic PID regulator function, it is based on the integral and the derivative as well as the control error itself. If there is a "regulating" function it will of course be a lot more advanced, but it should be possible to detect it with some experimentation.
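
    For anyone who hasn't seen one, a textbook discrete PID regulator looks roughly like the sketch below. The point is only that the output depends on the error itself (P), on its accumulated history (I), and on how fast it is changing (D) - which is how age and rate of change could both fall out of a single function. The gains here are arbitrary:

    Code:
    class PID:
        """Textbook discrete PID regulator: out = Kp*e + Ki*integral(e) + Kd*de/dt."""
        def __init__(self, kp=1.0, ki=0.1, kd=0.5):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def step(self, error, dt=1.0):
            self.integral += error * dt                   # history of the error ("age")
            derivative = (error - self.prev_error) / dt   # how fast it is changing
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    pid = PID()
    # error = gap between where a page "should" rank and where it ranks right now
    for error in [10, 8, 5, 2, 0]:
        print(pid.step(error))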
     
    laiwa, May 22, 2004 IP
  3. Foxy

    Foxy Chief Natural Foodie

    #3
    Hexed

    That is nice thinking and fits in with what I have been seeing but not yet analysed. For some "not too well" optimised pages that I have [yeah, I know you all will say every page, ha ha] I have been seeing them rise, fall out, come back, fall out and so on, and the only ones with stability are the ones that count on the factors that we have discussed elsewhere and are experimenting with. :)

    I look forward to seeing what the others say.
     
    Foxy, May 22, 2004 IP
  4. laiwa

    laiwa Peon

    #4
    If they use a regulating function it should look something like the one below. It may be a bit hard to see right off how it would be implemented, but a polynomial function would do the dampening job at least. The more the site is "tweaked", the more it will be dampened (derivative); also, the longer it stays in one position, the more it takes to get it moved from there (integral). The function shown below is actually quite a simple polynomial that can do this. If they apply self-adaptive functionality - nearly all modern regulators have this - they can actually set the different variables for each keyword. That would explain the different results for "hard" and "easy" keywords. I don't think it's a threshold in itself; it's more that the regulating function adapts to the changes in the SERPs in some way. Knowing that Google wants to have algorithms doing the job, they must in some way be into this:
    [attachment: simple polynomial function]
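
    Purely as an illustration (this is not the attached formula), such a polynomial could combine the terms like this, with the weights a, b and c being whatever gets tuned per keyword:

    Code:
    def dampening(tweak_rate, time_in_position, error, a=0.6, b=0.3, c=0.1):
        """Illustrative polynomial only.
        tweak_rate       : how fast the site/backlinks are being changed (derivative part)
        time_in_position : how long it has held its current spot (integral part)
        error            : gap between the 'deserved' and the actual position."""
        return a * tweak_rate**2 + b * time_in_position + c * error

    # A heavily tweaked site gets dampened more; a long-settled site resists movement.
    print(dampening(tweak_rate=5, time_in_position=100, error=10))
    print(dampening(tweak_rate=0.5, time_in_position=2, error=10))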
     
    laiwa, May 22, 2004 IP
  5. hexed

    hexed Peon

    #5
    Thanks for your comments, both of you.

    I'd just like to add that, in the Schmitt trigger example, the 5.0 V output would represent whether the website is included in the SERPs or not. When your website variables (the input) reach a certain threshold (1.7 V in the analogy), the switch is flipped and you instantly appear in the SERPs at the point where you crossed the threshold.

    This explains why nobody ever really shows up at 9000 in the SERPs, but rather at 100. The threshold would be set at 110 or so for that keyword combination, and until you pass it, you are filtered and do not show up in the SERPs.

    Do I sound insane?

    Hexed
     
    hexed, May 22, 2004 IP
  6. Foxy

    Foxy Chief Natural Foodie

    #6
    No, completely rational - talking of which, where is he? Hehe

    Thanks laiwa, I haven't looked at such a formula in a long time. This forum just gets better and better.

    The thing we sometimes forget about search engines is that they apply mathematical formulae as determined [read: applied] by the humans who designed them, and when applied in the total abstract this can become an art form.
     
    Foxy, May 22, 2004 IP
  7. laiwa

    laiwa Peon

    #7
    It would be quite easy, without any great computing power, for Google to apply a secondary "regulator filter" to the SERPs. Most likely it is some combination of a soft algorithm and threshold function(s). This also explains why some sites "hang around" for a while after they have been changed or deleted, even though they are spidered.

    If Google has gone adaptive, they will weigh the different regulation and threshold variables depending on the competition for the keyword and on how much movement there is in the SERPs for that specific keyword. This could mean that a site that, for example, gets lots of backlinks quickly then triggers the derivative part of the function. For a competitive keyword this could generate a major negative penalty - the site goes into the clouds. With time the derivative goes down, of course, and the site goes up in the SERPs. This can look like a "threshold", but it is actually a derivative function.

    The polynomial could weigh both on-site and off-site factors in this model. The reason I keep talking about polynomials is that this is the type of function Google likes, like the PageRank function. They don't need very many of these before they have really complicated life for us. A polynomial can also model quite well the types of processes that exist in the SERPs.
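
    As a toy illustration of the derivative part only (all numbers invented): the "penalty" below is just the weekly growth of the backlink count, so a burst of new links creates a large penalty that dies away by itself as the growth flattens. From the outside that can look like a fixed threshold or sandbox period:

    Code:
    # Weekly snapshots of a site's backlink count after a link-buying burst.
    backlinks = [0, 50, 400, 900, 1000, 1020, 1030, 1035]
    kd = 0.05  # derivative gain (arbitrary)

    for week in range(1, len(backlinks)):
        growth = backlinks[week] - backlinks[week - 1]
        penalty = kd * growth
        print("week %d: +%d links -> penalty %.2f" % (week, growth, penalty))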
     
    laiwa, May 22, 2004 IP
  8. nohaber

    nohaber Well-Known Member

    #8
    Hexed,
    let me guess. You've never really been a computer programmer. :eek:

    Now, I'll give you my "SEO theorem".

    If you've never been a computer programmer solving at least basic algorithmic problems, if you don't know stuff like algorithm complexity, graph algorithms, NP-completeness etc., if you don't know the math, if you don't THINK like a programmer or software engineer, if you CAN'T THINK like the guys/gals at Google, Yahoo, etc. - HOW DO YOU MAKE UP SERP THEORIES?

    Let me tell you something. Some years ago I went to my first programming competition. I had a lot of programming experience; I had written so much code in so many different languages, including difficult stuff like assembler. I thought I knew programming. I got my a** kicked, because the only algorithm I knew was backtracking (trying all possible solutions to a problem, which takes much more time than is allowed in competitions).

    Then I learned something: if you don't have experience with algorithms, you can't solve such tasks. In the following years I was coached by the professor who trained our national university and school programming teams, and I learned so much about algorithms that I even took one national first place and a lot of other local first places. During this time I was drinking like a horse and writing much less code than before; I was becoming a much lazier, less practised programmer. But I still won lots of programming competitions, even though I was such an alcoholic and out-of-shape programmer.

    The take-home message is: if you don't know algorithms, if you haven't solved such problems, if you haven't participated in programming contests where you solve problems efficiently to score points - YOU HAVE NO CLUE WHAT YOU ARE TALKING ABOUT. I am telling you, this stuff is very specific. Even programmers with lots of years behind their backs can't do well when it comes to programming such tasks efficiently.

    That goes for about 99% of the SEO experts out there. You have to know what's possible and what's feasible.

    Thank you :cool:
     
    nohaber, May 22, 2004 IP
  9. hexed

    hexed Peon

    #9
    That's really quite funny, and I have no idea where you got that impression. I also don't know what side of the bed you woke up on, because even if I weren't a programmer, that's a pretty nasty statement to make to someone. I don't want to sound like I'm ego-tripping here, but I think it's time to put you in your place.

    I have programmed for over 20 years, probably before you knew what a computer was.

    I am a computer engineer with a master's degree and I code in assembler, C, C++, Java, PHP, Perl - the list goes on. I also design autonomous robots from scratch and program them in microcontroller assembler floating-point code, using a vector-driven drive system with 360-degree motion.

    I also teach assembler and robotics to university students part-time.

    Just because you got your a** kicked in a competition because all you knew were linear O-notation algorithms, don't take it out on people trying to do research and development. And if you do, don't go attacking people personally, because it's very much frowned upon, especially when we're trying to assist you. But then again, this is probably why you're making calorie counters and fitness graphs instead of doing some real R&D and engineering.

    I think it's time to bite your tongue. :rolleyes:

    Hexed

    PS - Sometimes I don't even know why I try to assist people like you.
     
    hexed, May 22, 2004 IP
  10. nohaber

    nohaber Well-Known Member

    #10
    hexed,
    you don't get the point. You may have 100 years of programming, but that does not help with understanding algorithms. It's all specific. Someone who has developed 10,000 PHP sites can't solve a problem with dynamic optimization (just an example).

    Now, I have a question. How would Google implement this part of your theory?

    "Google applies a dampening effect to new sites specific to the keyword or keyword combination. The amount of dampening and the length of time it takes to be even listed is proportional to the amount of sites that are already listed in the SERPs for those specific keywords."

    Please, be specific. How would you code this into a search engine? :)

    btw, if you live in my country, you'll never want to do R&D because there are better-paid jobs ;)
     
    nohaber, May 22, 2004 IP
  11. compar

    compar Peon

    #11
    I've attacked a few people in forums in my day, but I have never seen such a villainous and unprovoked attack as the one by nohaber on hexed.

    It was totally uncalled for and the worst thing of this type we have seen on this forum to date. Hexed, don't let him suck you into this any further. And nohaber, learn some manners.

    Now back to the question at hand. How do the mathematicians who are attempting to speculate on what Google is doing with their algorithm explain the fact that you seem to be able to circumvent it with a search that includes a series of "-nonsense terms"?

    Many of us in the McDar thread have tested our various keywords both with and without the -kfjks -ldsisdl -ljaffsdl -fjalk -kfaafs -laj -lafksjalj. In each case the sites we are actively working on - i.e. added links to in the last few weeks and months - rank much higher with the "-nonsense terms".
     
    compar, May 22, 2004 IP
  12. schlottke

    schlottke Peon

    #12
    nohaber-

    It is, after all, a theory - yet a sound one. You know, at most, as much as the rest of us. You have no place to rip into Hexed the way you did - you are uninformed. Google doesn't need to do it purely mathematically - they don't follow suit with all the algorithm basics. They block results, and that by itself goes against the mathematics!
     
    schlottke, May 22, 2004 IP
  13. schlottke

    schlottke Peon

    #13
    "Google applies a dampening effect to new sites specific to the keyword or keyword combination. The amount of dampening and the length of time it takes to be even listed is proportional to the amount of sites that are already listed in the SERPs for those specific keywords."


    Well, let's think about this for just one second. "Search engine optimization" requires thousands of backlinks and is highly competitive - BUYING 10,000 PR8 links would move you to #1. Google builds into their algo that if a term has 10,000,000 sites, each one has to reach a specific number, set by them, to place where it does in the results. Say 1,000,000 of those sites are optimized to a degree, making it difficult to rank high: Google makes this search phrase take longer in the sandbox.

    Then go to a mid-range term like "Football Helmets": these results are less optimized for and there are fewer sites involving the term - quicker moves to the top.

    Now take a term like "nohaber". Nobody is optimizing for this at all. In two weeks' time your profile here (or on another forum) will jump to the #1 spot, because it isn't optimized for and no sites are keying on that term.

    It makes sense if you use your brain, it really does. I can't say for sure yet whether it is true or not, but if and when I show up #1 for all my terms in 1-2 months, I'll serve you a plate of crow.
     
    schlottke, May 22, 2004 IP
  14. nohaber

    nohaber Well-Known Member

    #14
    You still don't get the point, do you?

    You type "keyword1 keyword2". What happens next? What data structures does Google use? What's the complexity of implementing the theory? How much memory does it take?

    Can it run in under 0.20 seconds on commodity PCs?

    That's my question. Is it technically feasible? If yes, how??? Be specific.
     
    nohaber, May 22, 2004 IP
  15. rickbender1940

    rickbender1940 Guest

    #15
    Interesting theory. Have you run any tests to confirm parts of it? We'd be interested in hearing the results.
     
    rickbender1940, May 22, 2004 IP
  16. rickbender1940

    rickbender1940 Guest

    #16
    Actually, why don't you post this in Webmasterworld? There are a lot of big-time SEOs there; you'd get quick feedback if anybody has data supporting the theory. And post your Webmasterworld username so we can follow along.
     
    rickbender1940, May 22, 2004 IP
  17. Foxy

    Foxy Chief Natural Foodie

    #17
    nohaber, you are out of order.

    There is a way and a method of querying the evidence and accepting it or not - the way you have come in here smacks of being on the drink. Do you still?

    If you had ever been to university - which I doubt - you would know that people are allowed to propose theories without being "put down". Consensus determines whether a theory is accepted or not - not you as a person.

    So I am so incensed by this attack that, in the interests of keeping this a friendly and technical forum [now how do I feel? there seems to be a lot of aggro arriving recently as this forum becomes known for being interesting and free-spirited], I am going to tell you a bit about me, which, as a private person, I don't normally do - but you, nohaber, should listen.

    I am from an age when we used punched cards to load information into IBM 360/30s, in a language called FORTRAN IV; PL/1 was the new kid on the block, and assembler was, well, assembler. Micros and the rest did not appear until Apple in 1978.

    I am a mathematician, and biochemist, and rower, and sailor and skier and cook and... I didn't do the programming thing because I found it... stultifying [enough to send you to drink - as you did]... but I found with the Internet in the 80s [were you born then?] that there was something that might interest me, though it took until 1999 for me to use this medium.

    Now, when you [that is you, nohaber] have spent time considering the world a little more widely, as I and many others have, and have seen that this theory of Hexed's does have merit - well thought through, and backed up by practical examples from my sites as well as others' [called consensus] - then do try to behave as though you were trained by a university and not by a fishwife.
     
    Foxy, May 22, 2004 IP
  18. Foxy

    Foxy Chief Natural Foodie

    #18
    Why doesn't he just stay here in the better forum and wait?

    What "big-time SEO's in there" ? In Webmasterworld? Oh don't make me laugh!

    Goodness me where did you come from?

    Are you another worried about how good this forum is? :mad:
     
    Foxy, May 22, 2004 IP
  19. rickbender1940

    rickbender1940 Guest

    #19
    Not at all. I just want to see the biggest pool of data; wherever the info comes from is where I am. And there are some guys in there running BIG sites and collections of sites. Some of the affiliate guys are apparently making five figures. Mind you, WMW has an annoying policy of no URLs, no specific keywords, etc. And a lot of nonsense gets posted about "Just make your content good, the rankings will come"!!
     
    rickbender1940, May 22, 2004 IP
  20. compar

    compar Peon

    #20
    There may be a "lot of big-time SEO's" on Webmasterworld, but there are a lot of damn good SEOs on this forum. I agree with Foxy - let the "big-time SEO's" come to us. They might learn something.
     
    compar, May 22, 2004 IP