A few words about natural links vs. automated links

Discussion in 'Link Development' started by RyanDan, Nov 21, 2011.

  1. #1
    What is natural? Seriously. Sorry for the example if you're not religious, but it's the best one I could think of. Let's say Jesus descended from the sky tomorrow. Don't you think news sites would link to the story in every way imaginable, and do you think those links would only come from dofollow sites or sites with a PR of 3 or more? Now let's say that Jesus, for some reason, decided to reveal himself only to a small group of people who just so happen to stream video on a barely linked site that ranks for almost nothing. He performs miracles for them, turning water into wine and healing people around the world, from a house where one of the site owners lives, and it is all streamed live on their site.

    Imagine the links this site would receive in just a month. Millions if not billions of people of all kinds would link to it. Do you really think Google would ban the site just because it received too many links too fast, or because a huge number of low PR sites linked to it? Are these not the so-called natural links you talk about? Is this not exactly how a site with something amazing to hear, view or see would get linked?
    It reminds me of a superstitious person: don't walk under a ladder or break a mirror because it will bring you bad luck. The reality is that only the belief in bad luck creates it. You believe you are unlucky, so when something bad happens, like a tree falling on your house, you blame it on being unlucky. In actuality, it's just nature being nature. Trees fall on houses every year, and I'm sure not all of those owners walked under a ladder or broke a mirror.

    I'm not going to sit here and tell you that I have it all figured out. I'm just speaking from experience and what I have seen. Think about it: if Google only rewarded links that were built "naturally" and slowly, why would it count links from any area of a site where we can place them ourselves? If Google really ranked a site, in the search results or in PR, only by the sites that linked to it naturally, it would simply discard any link from an area of a page that lets us add it ourselves and count only the links placed in areas the site owner controls.

    This is what I have experienced. Take my latest site, for example: it is less than 6 months old and is on the first page of Google for numerous keywords, keywords that get thousands if not tens of thousands of exact searches a month, providing me with a small stream of income. This site, again less than 6 months old, has thousands of links from articles, profiles and social bookmarks, with a small number of blog comments. 99% of these links were created with automated software. I'm not condoning spam, because we all hate it. You can use software and still keep from spamming if you do it right.

    What I have seen is sites less than a year old, with thousands of links, sitting in the first position on page one for a keyword that gets 30,000 exact searches a month and has millions of competing sites. That flies in the face of things people on this forum say over and over, things like: if you build too many links to a new domain it will get sandboxed for a year or more, and if you build links too fast Google will see them as spam and penalize you. I have seen one site around a year old with roughly 40,000 links in the first position of page one. That works out to about 3,333 links a month, roughly 833 links a week, or a little under 120 links a day, all from forum profiles, link directories, social bookmarking and articles.

    That means that if you sleep 8 hours a night, you would have to build about 7.5 links every hour, 16 hours a day, every day. Sounds to me like someone was doing some automated linking, and yet they are number one for a keyword with 30,000 exact searches a month, while everyone says "only build links naturally because that's what Google wants and likes." Yes, some sites do get "sandboxed," but I would say the sandbox is just Google being Google. Every newly indexed site with decent content gets a few days or even weeks up front in Google before dropping off the map to the back pages. I think this is Google's way of telling people, "this is a new site, check it out."
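    If you want to check that arithmetic yourself, here is a quick sketch in Python. The 40,000 links, the roughly one-year age, the 4-week month and the 16 waking hours are just the round numbers from the example above, not exact figures for any real site:

    # Back-of-the-envelope math for the example above.
    total_links = 40000          # roughly 40,000 links observed
    months = 12                  # site was around a year old

    per_month = total_links / months      # ~3,333 links a month
    per_week = per_month / 4              # ~833 links a week (4-week month)
    per_day = per_week / 7                # just under 120 links a day
    per_waking_hour = per_day / 16        # ~7.4 links per hour, awake 16 hours

    print("%.0f/month, %.0f/week, %.0f/day, %.1f per waking hour"
          % (per_month, per_week, per_day, per_waking_hour))

    However you round it, that pace is not something anyone is doing by hand, day in and day out.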

    People then say, "my site was sandboxed!" when the truth is it just got its proper ranking from Google, which doesn't consider it that good of a site (yet). People then start the "natural," slow link building so as not to make Google any more upset than they think it already is, and one day the site is back on the first few pages. Not because the sandbox released it, but because over those months they built enough content and links that Google now favors it.

    Had they quickly added content and links, they would have been back in the first handful of pages within probably weeks of the "sandbox." Google loves content, and because of this you can rank well if you write a lot of unique content before you even register your domain. Then, once you set up your site, add lots of content to it immediately, so that by the time your site is indexed, and hopefully before any sandbox effect, Google already sees it as an authority. I even throw some links at it from blog comments within the first few days of it being indexed, because blog comments get indexed and count almost immediately.

    If you want PR, then fill your site with lots of content. It doesn't even have to be that unique. I know people who set up backlink-indexing blogs that pull all of their content from ezine RSS feeds, and they achieve PRs of 2, 3 and even 4. I'm not saying to set up link farms and then, once they have PR, try to sell links on them, because nobody will pay for those. What I am saying is that even with lots of duplicate content you can achieve PR. Then you can sell the domain or sell links on the site. That is why article marketing is a great tactic in link building.
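    If you are curious what those RSS-fed blogs actually do, here is a minimal sketch of the idea. The feed URL is just a placeholder and I'm assuming the third-party feedparser library; the real autoblog plugins do the same thing with a lot more polish (saving entries as posts, scheduling and so on):

    # Minimal sketch: pull article titles and summaries from an RSS feed,
    # which is the basic trick behind the autoblogs described above.
    # Requires: pip install feedparser
    import feedparser

    FEED_URL = "https://example.com/articles/rss"   # placeholder feed URL

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries[:10]:
        title = entry.get("title", "")
        summary = entry.get("summary", "")
        link = entry.get("link", "")
        # A real autoblog would store these as posts; here we just print them.
        print(title)
        print(summary[:200])
        print("source:", link)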

    If you want to rank well in the search engine, then build a decent amount of content and a lot of backlinks. If you are that worried about building backlinks too fast, then set up web 2.0 buffer sites and link wheels and backlink the crap out of those; this will pass the link juice to your main site. To be honest with you, whether it is automated or not, fast or slow, it doesn't seem to upset Google. I will give you an example that I have posted in a few threads before, so sorry if I'm repeating myself.

    I use Google Webmaster Tools to keep track of links to my site, like many others. If you use GWT, then you have probably noticed that Google updates your links around once a month or so. In between these updates I am constantly building links to my site. Weeks and weeks go by with no new links added, according to Google. Then one day, boom, I have hundreds of new links. So if Google were so concerned with tracking how many links a site gets in any given amount of time, why wouldn't it update your links as soon as they are indexed?

    I will tell you why: because there is no way for Google to tell how important what is displayed on your site is to the world except by the number of backlinks it gets. The algorithm can get a basic understanding of what your site is about from the repetition of words, but it cannot judge for itself what the world finds interesting. Thus we had the birth of the backlink, and that is the basic foundation the algorithm was built on.
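    Nobody outside Google knows the exact algorithm, but the published PageRank idea captures that links-as-votes foundation. Here is a toy version; the four-page link graph and the 0.85 damping factor are just the standard textbook example, not anything taken from a real site:

    # Toy PageRank: backlinks are treated as votes for a page.
    # The tiny link graph below is made up purely for illustration.
    damping = 0.85
    graph = {                   # page -> pages it links out to
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }
    rank = {page: 1.0 / len(graph) for page in graph}

    for _ in range(50):         # iterate until the scores settle
        new_rank = {}
        for page in graph:
            incoming = sum(rank[p] / len(graph[p])
                           for p in graph if page in graph[p])
            new_rank[page] = (1 - damping) / len(graph) + damping * incoming
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))

    Page C comes out on top simply because the most pages link to it. That is the whole "importance by backlinks" idea in miniature, and nothing in it cares who placed the link or how fast the links showed up.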

    I know the debate will rage on and on, and I know some people will totally disagree with what I have said. I know people will still post threads asking, "why can't I get my site on page one?" That is the great part about the internet: freedom, freedom of speech (for the most part) and the freedom to form our own opinions. All I can say is stop being so afraid. I know every site is a labor of love, but stop being afraid.

    Thanks for reading,
    Ryan

    “No passion so effectually robs the mind of all its powers of acting and reasoning as fear.”
    EDMUND BURKE

    PS: I didn't intend this thread to be a religious debate, so please don't turn it into one, religious or not.

    PPS: Here is a little test: can you tell which links were created by a human and which were created by an automated program, if any?

    (two social bookmark sites)

    Site PR 0 http://www.ies-radiantbarrier.com/add-story/search.php?q=1%2F2+a+cent+article+links&submit.x=39&submit.y=18
    Site PR 0 http://huzoo.com/stories/214121/12_a_cent_article_links.html

    (two forum profiles)
    Site PR 4 http://www.les4800.com/user_detail.php?u=falcon
    Site PR 3 http://dogdirectory.citizencanine.org/user_detail.php?u=ryandan
    If you can't tell which link is which, what makes you think an algorithm can?
     
    RyanDan, Nov 21, 2011 IP
  2. #2
    I would say the profile links were made automatically, using Xrumer.
     
    geester1, Nov 22, 2011 IP
  3. #3
    Nice post, though I still consider manual linking the best way to get quality links.
     
    indianseocompany, Nov 23, 2011 IP
  4. #4
    I have always built links manually and was often laughed at for my slow pace and the small number of links I built. But after the recent Panda update, I have noticed that a lot of sites have dropped from the search results or have lost PR, while the PR of my site jumped from 1 to 3. It pays.
     
    SerVision, Dec 2, 2011 IP