Do HTTP errors affect a website's ranking position?

Discussion in 'Search Engine Optimization' started by libertines, Apr 10, 2008.

  1. #1
    I know that if a page you want to rank for returns an HTTP error or a bad request, that will obviously affect its ability to rank. But if there are HTTP errors throughout your site (say 15 in a 300-page site), would that be detrimental to the rest of the pages that you want to rank for?

    Do you think Google takes the usability and functionality of a website into account?

    I don't think there has been any testing in this area, and it would be interesting to see.

    Any thoughts would be appreciated.
     
    libertines, Apr 10, 2008 IP
  2. whittier

    whittier Active Member

    #2
    If the HTTP error causes an incomplete crawl of a page or other pages on your site, the "uncrawled" material will not be indexed or appear in the search results.

    Also, if the page does not render properly, there is less likelihood of others linking to it. Eventually, a big price is paid for defective code.
     
    whittier, Apr 10, 2008 IP
  3. libertines

    libertines Peon

    #3
    Thanks for the reply, whittier. Very good points. I know and understand that, but what I wanted to know is whether having 20 404 error pages on a 300-page site would affect the overall website's ability to rank. I know that search engines rank web pages, not websites, but do you think that poor functionality or usability, from Google's point of view, would be detrimental to the rest of the pages?
     
    libertines, Apr 10, 2008 IP
  4. RankSurge

    RankSurge Banned

    #4
    The best way to account for 404 or even 500 errors is to include your sitemap links on your error-handling page. That way, if the SE gets hung up, it will still find all the important links on your site, and the same goes for visitors.

    You should also program your site to catch and track every error it generates, so that you can fix any orphaned URLs or broken links on your site.
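
    Something like the following covers both jobs. It's only a rough sketch assuming a Python/Flask stack, and the log file and template names are placeholders, so adapt it to whatever your site runs on:

    # Rough sketch: custom error handlers that log the broken URL and serve
    # a page carrying the sitemap links, while keeping the real status code
    # so search engines do not index the error page itself.
    import logging
    from flask import Flask, request, render_template

    app = Flask(__name__)
    logging.basicConfig(filename="site_errors.log", level=logging.INFO)

    @app.errorhandler(404)
    def page_not_found(error):
        # Record the missing URL and its referrer so orphaned or broken
        # links can be tracked down and fixed later.
        logging.info("404: %s referrer=%s", request.url, request.referrer)
        return render_template("404_with_sitemap_links.html"), 404

    @app.errorhandler(500)
    def server_error(error):
        logging.error("500: %s", request.url)
        return render_template("500_with_sitemap_links.html"), 500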
     
    RankSurge, Apr 10, 2008 IP
  5. whittier

    whittier Active Member

    #5
    Unfortunately, the only people who know the answer to your question are not about to tell you...

    I view 404 errors as very serious problems, so careful review of traffic logs is performed no less than once per day (and often hourly during the work day). Every time a page is modified, we validate every link on our site.
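
    For what it's worth, that daily review is easy to script. Here is a rough sketch in Python; it assumes a standard Apache/Nginx combined log format and a file named access.log, so adjust both to your own setup:

    # Scan the access log and tally every URL that returned a 404,
    # so the worst offenders can be fixed first.
    import re
    from collections import Counter

    LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

    def count_404s(log_path="access.log"):
        hits = Counter()
        with open(log_path) as log:
            for line in log:
                match = LOG_LINE.search(line)
                if match and match.group("status") == "404":
                    hits[match.group("path")] += 1
        return hits

    if __name__ == "__main__":
        for path, count in count_404s().most_common(20):
            print(f"{count:6d}  {path}")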

    I'm sure that conscientious webmasters wonder about site usability issues such as availability, download speed, spelling and grammar, and their effect on overall search rankings. My guess is that Google does not penalize you, but in the end, there is a big price to pay.

    If you make it a priority to correct the flaws in your website, you won't have to be concerned about Google's related quality assessment criteria.
     
    whittier, Apr 10, 2008 IP
  6. whittier

    whittier Active Member

    #6
    Rank, you bring up an interesting point about 404 messages and crawlers. I just assumed that once a 404 response was received, the crawler would not attempt to "read" the error message. (Did I understand you correctly?)

    For visitors, a good 404 message is essential. Most websites do not provide helpful error messages. Here is one that still needs to be improved but is better than the default that we all know and hate:

    http://www.iaps.com/exc/wrong-page.html

    How did you implement this solution?
     
    whittier, Apr 10, 2008 IP