197 errors on my website, 42 warnings.

Discussion in 'General Business' started by 24788, Feb 16, 2010.

  1. #1
    My website loads fine and at a good speed, but it has a ton of minor errors. Is this something I should worry about fixing?
     
    24788, Feb 16, 2010 IP
  2. automarketting

    automarketting Peon

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    Which site is yours? What kind of errors are they?
     
    automarketting, Feb 16, 2010 IP
  3. 24788

    24788 Peon

    Messages:
    529
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Missing tags, stray < characters, and the like.

    It's the top one in my sig.
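
    For example, a stray < in running text will trip the validator even though most browsers shrug it off. A hypothetical snippet, not from the actual site:

    Code (markup):
    <!-- flagged: a literal "<" in text must be escaped -->
    <p>Errors dropped to < 10 after cleanup</p>

    <!-- valid -->
    <p>Errors dropped to &lt; 10 after cleanup</p>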
     
    24788, Feb 16, 2010 IP
  4. dotcomdude

    dotcomdude Active Member

    Messages:
    100
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    68
    #4
    The quick answer is - try running yahoo.com through the same validator. You'll probably find it has more errors than your site does!

    The longer answer - and probably the better one - is that it depends on the kind of errors and warnings. Don't sweat the small stuff - but sort out the more serious ones.

    If you want to know which ones to fix, maybe post the warning messages on a forum like this and let the community help you decide which ones...
     
    dotcomdude, Feb 17, 2010 IP
  5. SmallPotatoes

    SmallPotatoes Peon

    Messages:
    1,321
    Likes Received:
    41
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Validation errors can mean there are problems with your HTML that will make it difficult for search engines to parse it.

    And, over time, they increase the chance that future versions of web browsers will be unable to display your site correctly.

    Better to fix them early. I don't launch a site unless it validates perfectly.
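
    A hypothetical example of the kind of markup that gives a parser trouble: an unterminated attribute value can swallow the content that follows it.

    Code (markup):
    <!-- the missing closing quote makes a parser keep reading,
         hunting for the end of the href value -->
    <a href="/products.html>All products</a>
    <p>This paragraph may be eaten as part of the attribute.</p>

    <!-- fixed -->
    <a href="/products.html">All products</a>
    <p>This paragraph is parsed normally.</p>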
     
    SmallPotatoes, Feb 17, 2010 IP
  6. chaoscript

    chaoscript Well-Known Member

    Messages:
    3,459
    Likes Received:
    12
    Best Answers:
    0
    Trophy Points:
    150
    #6
    I think it's better to fix them fast;
    Google likes sites that have a valid XHTML/HTML template.

    Cheers.
     
    chaoscript, Feb 17, 2010 IP
  7. Clive

    Clive Web Developer

    Messages:
    4,507
    Likes Received:
    297
    Best Answers:
    0
    Trophy Points:
    250
    #7
    That may be a misconception. How do you think a search bot is supposed to check a website for errors, and why would it do that? From my understanding, search bots are there to collect information, not to check for errors. It's a bot; it has no likes or dislikes :)

    The real issue is more likely page display problems if you don't take proper care of the coding errors.
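
    A hypothetical example of a coding error that causes a display problem rather than an indexing one:

    Code (markup):
    <!-- the <b> is never closed, so everything after it renders bold -->
    <p><b>Sale ends Friday.</p>
    <p>This paragraph comes out bold too.</p>

    <!-- fixed -->
    <p><b>Sale ends Friday.</b></p>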
     
    Clive, Feb 17, 2010 IP
  8. chaoscript

    chaoscript Well-Known Member

    Messages:
    3,459
    Likes Received:
    12
    Best Answers:
    0
    Trophy Points:
    150
    #8
    You're right,
    but you'll agree that it's easier for a bot to collect information from a page without errors.

    Cheers.
     
    chaoscript, Feb 17, 2010 IP
  9. Clive

    Clive Web Developer

    Messages:
    4,507
    Likes Received:
    297
    Best Answers:
    0
    Trophy Points:
    250
    #9
    Some HTML error types can limit how search engine spiders index a website, but I don't think an extra </tr> closing tag is a huge deal, even though the W3C validator will still count it as an error.
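
    That kind of harmless-but-invalid markup looks something like this hypothetical fragment:

    Code (markup):
    <table>
      <tr><td>Row one</td></tr>
      </tr> <!-- stray closing tag: the validator counts it as
                 an error, but browsers and bots just ignore it -->
    </table>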
     
    Clive, Feb 17, 2010 IP
  10. SmallPotatoes

    SmallPotatoes Peon

    Messages:
    1,321
    Likes Received:
    41
    Best Answers:
    0
    Trophy Points:
    0
    #10
    Totally wrong.

    Search indexers are computer programmes. They have no feelings or personality. They follow rules which tell them how to extract content from pages. Those rules are based on the same specs that validators use. If your page does not validate, then the bot will have trouble reading it for the same reason that the validator didn't like it.

    This is kind of an inane question. If a validator can check for errors, why can't a "search bot"?
     
    SmallPotatoes, Feb 17, 2010 IP
  11. Clive

    Clive Web Developer

    Messages:
    4,507
    Likes Received:
    297
    Best Answers:
    0
    Trophy Points:
    250
    #11
    Here's another insane question, actually: ask yourself why a search bot would be set up to check for site errors instead of just gathering information. Why would it be sent to your site and eat your bandwidth doing tasks you haven't requested? I'm rather inclined to believe that spiders have been wisely set up to strip most of the non-SEO tags and just index clean lines of content.
     
    Clive, Feb 17, 2010 IP
  12. lakhyajyoti

    lakhyajyoti Member

    Messages:
    57
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    41
    #12
    I also have some errors. They are crawl errors. I don't know how to remove them. Visit my site http://earningdiary.com
     
    lakhyajyoti, Feb 17, 2010 IP
  13. Clive

    Clive Web Developer

    Messages:
    4,507
    Likes Received:
    297
    Best Answers:
    0
    Trophy Points:
    250
    #13
    Here are some quick fixes:

    Fix for both:

    HTML:
    <img src="http://revtwt.com/images/TwtAd_referral01.jpg" alt="" /></a>

    Fix: remove the following attribute from the <form> tag:

    Code (markup):
    role="search"

    Fix for both:

    Code (markup):
    …Join earningdiary at MyBloglog!" /></a>

    There you go, a few more "crawl errors" left to be fixed ;)
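
    For context, the role attribute is not part of XHTML 1.0, which is why the validator rejects it. The form markup below is a hypothetical reconstruction, not the actual code from the site:

    Code (markup):
    <!-- fails XHTML 1.0 validation: "role" is not a defined attribute -->
    <form role="search" method="get" action="/search">

    <!-- validates once the attribute is removed -->
    <form method="get" action="/search">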
     
    Clive, Feb 18, 2010 IP
  14. tech_savvy

    tech_savvy Peon

    Messages:
    435
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #14
    Better to attend to the glitches right away.
     
    tech_savvy, Feb 18, 2010 IP
  15. SmallPotatoes

    SmallPotatoes Peon

    Messages:
    1,321
    Likes Received:
    41
    Best Answers:
    0
    Trophy Points:
    0
    #15
    Think about it. Imagine how you would write a parser or structured lexical analyser. If the input doesn't follow the prescribed model then it becomes much more difficult to work with.

    A validator is basically just the part of a search indexer that processes a single page, except that it's set to output the errors back to you instead of quietly failing and moving on to the next page.
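
    To illustrate, here is a hypothetical bit of mis-nested markup where an error-recovering parser has to guess what the author meant:

    Code (markup):
    <!-- closed in the wrong order: does the italic end at </b> or at </i>? -->
    <b><i>important</b> still italic?</i>

    <!-- unambiguous -->
    <b><i>important</i></b> <i>definitely italic</i>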

    P.S. "inane" != "insane".
     
    SmallPotatoes, Feb 21, 2010 IP
  16. Clive

    Clive Web Developer

    Messages:
    4,507
    Likes Received:
    297
    Best Answers:
    0
    Trophy Points:
    250
    #16
    You don't mean that Google Search cares about validation, do you?
    I actually believe I care more than most, but I was talking about something else in my post.

    Why would Google force websites to follow a certain standard in order to get indexed if they don't follow it themselves?

    Think about it :)
     
    Clive, Feb 21, 2010 IP
  17. sarahk

    sarahk iTamer Staff

    Messages:
    28,500
    Likes Received:
    4,460
    Best Answers:
    123
    Trophy Points:
    665
    #17
    It won't, but if it can't read your page because the machine doesn't understand what's on it, then you have problems.

    I wouldn't worry about alt tags, but I would worry about tags that are incomplete, missing, or mismatched.
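
    Hypothetical examples of the two ends of that scale:

    Code (markup):
    <!-- minor: missing alt attribute, worth noting but not urgent -->
    <img src="logo.png" />

    <!-- serious: the <div> is never closed, which can break both
         the layout and the structure a bot sees -->
    <div class="content">
    <p>Page content</p>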
     
    sarahk, Feb 21, 2010 IP
  18. Clive

    Clive Web Developer

    Messages:
    4,507
    Likes Received:
    297
    Best Answers:
    0
    Trophy Points:
    250
    #18
    True, Sarah.

    As funny as it may sound in this context, I am a coding perfectionist myself, and all my XHTML/CSS work validates as standard. Customers don't even need to point that out to me; I am a huge fan of clean coding.

    What I don't like is the slightly paranoid approach to validation, with claims like "Google bot will not index a page if it doesn't validate", which is a misleading concept. Anyone can do a simple search and then run the top-ranking results through the W3C validation tool. You'll be surprised to see how many fail the test, yet rank on top. The conclusion?

    If browsers can read a page then bots should be able to crawl it, too.
     
    Clive, Feb 22, 2010 IP
  19. sarahk

    sarahk iTamer Staff

    Messages:
    28,500
    Likes Received:
    4,460
    Best Answers:
    123
    Trophy Points:
    665
    #19
    I'm with you. Yesterday I was trying to suss out why a block of code looked good in all browsers but IE7. It turned out we had a <div> where an <li> was meant to be. It'll get indexed OK and viewed fine by most browsers, but because the code isn't right, we have a problem.
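
    Roughly the shape of that bug, as a hypothetical reconstruction:

    Code (markup):
    <!-- invalid: only <li> elements may be direct children of <ul> -->
    <ul>
      <div class="item">First entry</div>
    </ul>

    <!-- valid -->
    <ul>
      <li class="item">First entry</li>
    </ul>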
     
    sarahk, Feb 22, 2010 IP
  20. HuggyStudios

    HuggyStudios Well-Known Member

    Messages:
    724
    Likes Received:
    20
    Best Answers:
    26
    Trophy Points:
    165
    #20
    Valid code does not affect search engine rankings in my opinion; some people have made some pretty valid points about the reasons. Still, you should write valid code as a normal standard if you are making a website for a paying client. They would not be happy with anything else.
     
    HuggyStudios, Feb 22, 2010 IP
    Clive likes this.