
Cutting down W3C Errors - Faster Bot Crawls

Discussion in 'Google' started by ttomp13, Aug 10, 2008.

  1. ttomp13

    #1
    Having clean HTML can enable search engine bots to crawl
    through your website more efficiently. I bring this up because
    many people using Blogger have had issues over the past
    week or so, and I think it could have a little to do with W3C
    validation. Maybe that wasn't the case, but even so, newbies,
    especially Blogger users, are notorious for having horrible HTML
    code, and cleaning it up lets the bots crawl your site that much
    more efficiently.
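    If you want to see where you stand, run your page through the
    W3C validator and count what comes back. Here's a rough sketch of
    that in Python using the requests library and the W3C Nu checker's
    JSON output (the exact URL and field names are my reading of its
    public interface, so double-check them before relying on this):

    Code:
    # Rough sketch: count validator errors and warnings for a page.
    import requests

    def count_validation_issues(url):
        # Fetch the page we want to check.
        html = requests.get(url, timeout=30).content
        # POST the raw HTML to the Nu checker and ask for JSON back.
        resp = requests.post(
            "https://validator.w3.org/nu/?out=json",
            data=html,
            headers={"Content-Type": "text/html; charset=utf-8"},
            timeout=60,
        )
        messages = resp.json().get("messages", [])
        errors = sum(1 for m in messages if m.get("type") == "error")
        warnings = sum(1 for m in messages if m.get("subType") == "warning")
        return errors, warnings

    errors, warnings = count_validation_issues("http://example.com/")
    print(errors, "errors,", warnings, "warnings")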

    When it comes to Blogger, there is only so much you can
    actually do to cut your HTML errors. That takes the
    W3C validation process to a whole other level.

    Let's put it this way: if you have a widget on your blog, you have
    an error. If you do anything outside of the HTML section of Blogger,
    you have an error. If you have Entrecard on your blog, you likely
    have an error. Not filling out the alt attributes on your images? ERROR!
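    The missing alt attributes, at least, are easy to fix in bulk. A
    minimal sketch, assuming BeautifulSoup is installed (the template
    below is just an illustration); an empty alt is valid markup for
    purely decorative images, so swap in real descriptions where the
    image actually means something:

    Code:
    # Sketch: give every <img> that lacks an alt attribute an empty one.
    from bs4 import BeautifulSoup

    template = """
    <div>
      <img src="badge.png">
      <img src="logo.png" alt="Logo">
    </div>
    """

    soup = BeautifulSoup(template, "html.parser")
    for img in soup.find_all("img"):
        if not img.has_attr("alt"):
            img["alt"] = ""  # empty alt keeps the validator happy

    print(soup)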

    The only way to be error free is to do everything directly in
    HTML. You know that section of Blogger that lets you change
    your fonts and colors? That's one big error in the system.

    Don't ask me how, but this guy managed to make the home page
    of his blog error free. Of course, he's not using hacks such as the
    "expandable post summary hack" or the "title tag optimization hack."
    Even so, getting to zero errors in Blogger is a daunting task, and this guy pulled it off.

    Me? I managed to pull it off on my test blog as well. However, if I
    want comments enabled, I get errors. Still, not too shabby.

    This is the best I have gotten my errors down to for my website, A Man'z Blog.
    That, of course, is with comments enabled and some website hacks included.

    Cutting my errors down from 400+ errors and 100+ warnings all the way
    to 38 errors and 30 warnings should let search engine bots crawl my site
    more efficiently. Overall, my pages should be indexed quicker, and traffic should increase.
    I have read that W3C validation doesn't have a direct impact on PR or the SERPs.
    However, being indexed faster than normal and being more "liked" by the search engines never hurts.

    Tip: Click here to learn how to cut your errors, sometimes by almost half, in one simple step.
     
    ttomp13, Aug 10, 2008 IP
  2. inet (Well-Known Member)

    #2
    Thanks for the heads up, appreciate it.
     
    inet, Aug 10, 2008 IP
  3. FREE BET (Peon)

    #3
    I completely agree...

    However, I don't think it's much needed at the moment; at least in my niche, competitors who rank at the top of the SERPs with tons of errors are still there... It's not mandatory, but it definitely helps.
     
    FREE BET, Aug 10, 2008 IP
  4. sunil_gupta20801 (Active Member)

    #4
    Great effort! I will also do this in my free time. :)
     
    sunil_gupta20801, Aug 10, 2008 IP
  5. vivid_designs (Peon)

    #5
    I agree.

    My website is running XHTML 1.0 Strict and CSS validated by the W3C with zero (0) errors. I have definitely noticed a slight advantage compared to some competitors' sites. My pages get indexed fast and, for the most part, keep their high positions in the SERPs.

    This may be luck on my part, or it may go along with the new Web 2.0 structure. I think it's the structure. If you take your time and code your site right, it'll greatly benefit you.

    Let me ask this: has anyone ever gone through their website with a screen reader? Try it; that's basically how the site appears to the Google Bot.
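    If you don't have a screen reader handy, here's a rough way to
    approximate that text-only view in Python, using just the standard
    library (how close this is to what the Google Bot actually sees is
    my guess, not anything official):

    Code:
    # Sketch: dump a page the way a text-only reader would see it.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TextOnly(HTMLParser):
        def __init__(self):
            super().__init__()
            self.skip = 0       # depth inside <script>/<style>
            self.chunks = []    # visible text fragments

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self.skip += 1

        def handle_endtag(self, tag):
            if tag in ("script", "style") and self.skip:
                self.skip -= 1

        def handle_data(self, data):
            if not self.skip and data.strip():
                self.chunks.append(data.strip())

    html = urlopen("http://example.com/").read().decode("utf-8", "replace")
    parser = TextOnly()
    parser.feed(html)
    print("\n".join(parser.chunks))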

    I even went to the extent of formatting the CSS so that even broken images and titles would display a certain way. This isn't black hat, just some very clever ways to plan and structure a page.

    Anyone have any thoughts on this?

    - Jeff
     
    vivid_designs, Aug 10, 2008 IP
  6. guidyy (Active Member)

    #6
    I've seen somewhere (maybe WebProNews) a video interview with Vanessa Fox about sitemaps where, at one point, she states that Google doesn't care about code; all they want is good content.
    At the moment, having clean, error-free code is not a Google issue, but it sure makes your pages render faster, and they are much easier to debug.
    Did you ever wonder why Google's code (e.g., AdSense) never validates?
     
    guidyy, Aug 11, 2008 IP
  7. adsforum168 (Peon)

    #7
    So the bottom line is content, content, content?
     
    adsforum168, Aug 11, 2008 IP
  8. ttomp13 (Active Member)

    #8
    Oh. Well. Thanks for the update. Can anyone confirm this?
    PS: My template has a lot of errors right now because
    it's not my "Error Free Template."

    My amanzblog site also has more errors because I quickly
    threw some things together last night. I'll fix the errors
    when I get back from Nevada on Friday.
     
    ttomp13, Aug 11, 2008 IP
  9. nagarjuna.k (Peon)

    #9
    I think WordPress is the better option for meeting W3C standards. :)
     
    nagarjuna.k, Aug 11, 2008 IP
  10. sweetfunny (Banned)

    #10
    I redid one of my sites about 6 weeks ago. It had 600+ errors, I converted it to valid XHTML Strict and CSS 2.1, and it made zero difference to its rankings or crawl rate.

    Every time I have done this, it's made no difference; even Matt Cutts has done a video saying it's not something Google takes into account.
     
    sweetfunny, Aug 11, 2008 IP
  11. guidyy (Active Member)
  12. vivid_designs (Peon)

    #12
    That was a great post! Thanks for the link!
    In response to that, I have found this video that may be of some interest :D

    http://videos.webpronews.com/2008/08/29/ses-catching-up-with-vanessa-fox/


     
    vivid_designs, Sep 1, 2008 IP