Compatibility with W3C standards

Discussion in 'Search Engine Optimization' started by nicram, Dec 9, 2005.

  1. #1
    Does a site's position in the search engines depend on its compatibility with W3C standards?
     
    nicram, Dec 9, 2005 IP
  2. Las Vegas Homes (Guest)

    #2
    IMO I do not believe that it does. I know of several top sites that do not validate, but I will say it doesn't hurt if a site does. Most of the sites in the real estate industry do not validate; most of them are built with FrontPage or Dreamweaver, and it is difficult to get those programs to produce HTML that validates.
     
    Las Vegas Homes, Dec 9, 2005 IP
  3. nicram (Peon)

    #3
    Let me sharpen my question: if you know that a site is standards-compliant, do you treat it better?
     
    nicram, Dec 9, 2005 IP
  4. daredashi (Well-Known Member)

    #4
    Yep, definitely.
    Search engines get better access to your content and site structure; this is why blogs (almost all blogs are W3C compliant) have better rankings.
    If two sites have the same backlinks, the same weight and the same SERP criteria, the one that is W3C-validated will stand out over the one that isn't.
     
    daredashi, Dec 9, 2005 IP
  5. INV (Peon)

    #5
    W3C isn't a factor at all. It's not true that a W3C-correct website will outrank a non-compliant one in an otherwise even scenario. As long as the page is spiderable, it's all fine. The bots ignore the tables and many other things, such as CSS, and look for the content data: text, images, etc. You have to remember that the engine is after the best information; that's what it's used for. If it finds a website whose information and other factors make it the better fit, it doesn't care whether the page is formatted to W3C standards.
     
    INV, Dec 9, 2005 IP
  6. nicram (Peon)

    #6
    Don't you think that if a site owner takes care about standards, he takes better care of his site in general? Maybe search engines "think" so too?

    I think it is a good criterion: if I had two sites at the same "level" of content, links, etc., I would place the standards-compliant one higher in the search results. Don't you have any experience with this? Did your site go up after you made it standards-compliant?
     
    nicram, Dec 9, 2005 IP
  7. piernik (Peon)

    #7
    I think exactly the same, INV.
    You can't have terrible mistakes like forgetting the head or body tags, but smaller ones ("/>" instead of ">")? Give me a break ;)
     
    piernik, Dec 9, 2005 IP
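
    To make the "/>" versus ">" point concrete, here is a hypothetical fragment. Browsers and spiders treat both forms the same; only a validator cares, and only relative to the declared DOCTYPE - XHTML requires the self-closing slash on empty elements, while HTML 4.01 does not:

        <!-- HTML 4.01 style: empty elements are written without a slash -->
        <br>
        <img src="logo.gif" alt="Site logo">

        <!-- XHTML style: the same elements, self-closed -->
        <br />
        <img src="logo.gif" alt="Site logo" />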
  8. INV (Peon)

    #8
    Again, it's about relevant information found on the site, not W3C standards.
     
    INV, Dec 9, 2005 IP
  9. palespyder (Psycho Ninja)

    #9
    I wish I could share your optimism, INV. I personally believe the SEs would prefer W3C compliance for a few reasons, first and foremost being HTML errors, second being page size and load time. A well-designed, W3C-compliant page will load much faster than a bloated HTML page filled with tables. Though I don't believe compliance in and of itself is a factor, I DO believe that SEs look at factors such as errors in the code, because they truly are looking for the best possible information, as you stated above. But don't you think they would also be looking for a page without HTML errors as the best fit for a user?

    This sounds like I am arguing, but I am actually trying to get input. Man, I hate typing ;)
     
    palespyder, Dec 9, 2005 IP
  10. wrmineo (Peon)

    #10
    I have to agree/disagree, palespyder.

    As Minstrel has painstakingly attempted to point out in another thread, there is a difference between valid code and W3C-compliant code.

    Code can be valid in that there are no unclosed tags, broken links, duplicated anchor texts, etc., and still fail W3C standards for something as simple as not having a proper DOCTYPE or encoding. Google is a great example of valid code that is not W3C-valid, IMO.

    However, having said all that, there can be such a minute difference between the two that I can't see having valid code and not taking the extra step of making it W3C-valid.

    Finally, valid code and W3C-validated pages do work best IMO, for the simple fact that they will get indexed without errors, faster and cleaner et al., so yes, I think they could have a "leg up" in rankings as a result.
     
    wrmineo, Dec 9, 2005 IP
    daredashi likes this.
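
    A hypothetical page makes this distinction concrete. The markup below is valid in the everyday sense - every tag is closed and properly nested - yet it fails W3C validation, because the validator needs a DOCTYPE to know which standard to check against, and full compliance also wants a declared character encoding:

        <html>
          <head>
            <title>Clean markup, but not W3C-valid</title>
          </head>
          <body>
            <h1>Hello</h1>
            <p>Every tag here is closed and nested correctly.</p>
          </body>
        </html>

    Two additions satisfy the validator: the HTML 4.01 Strict DOCTYPE at the top of the document, and an encoding declaration inside the head:

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
            "http://www.w3.org/TR/html4/strict.dtd">
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8">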
  11. palespyder (Psycho Ninja)

    #11
    wrmineo,
    But do you not agree that nested tables bloat the code uncontrollably? I mean, I have seen sites that used up to six nested tables. I agree that so long as the code is valid, there is no problem; my saying "W3C compliance" was just a general statement, and I should watch those ;) So read my previous statements as "valid HTML" - I say W3C compliance because you can check for all forms of valid code there ;)
     
    palespyder, Dec 9, 2005 IP
    wrmineo likes this.
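
    A rough, hypothetical before-and-after illustrates the bloat in question. A spider reaches the same text either way, but the nested-table version ships several times the markup on every page view, while a stylesheet is downloaded once and cached:

        <!-- table-based layout: three layers of wrapper around one piece of content -->
        <table><tr><td>
          <table><tr><td>
            <table><tr><td>Welcome to my site</td></tr></table>
          </td></tr></table>
        </td></tr></table>

        <!-- tableless layout: one element; the positioning lives in the stylesheet -->
        <div id="welcome">Welcome to my site</div>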
  12. wrmineo (Peon)

    #12
    palespyder, I wholeheartedly concur with everything you're saying, and a definite yes on the nested tables too, especially when done to excess.

    I wasn't picking apart your "general" statement so much as clarifying, as a precursor to the potential onslaught of naysayers ... ;)
     
    wrmineo, Dec 9, 2005 IP
  13. palespyder (Psycho Ninja)

    #13
    Oh no, not at all. I am glad it was pointed out; I am really bad about making general statements like that and then wishing I could retract them. No harm, no foul ;)
     
    palespyder, Dec 9, 2005 IP
  14. INV (Peon)

    #14
    Thanks, PaleSpyder, for the second opinion. My reply would have been similar to wrmineo's in this case if I had posted before him. I should have made it clear what I was talking about and explained the difference between valid HTML code and W3C compliance. There are many occasions where the HTML code itself is actually valid and perfectly fine for a spider to pull the info from, yet there is the occasional style tag or background tag that W3C doesn't find valid for the associated page encoding. In my experience those don't show any ties to search results. I can even argue that nested tables, up to a certain limit, aren't a problem for the spider to index. The spider, I believe, indexes the code as it is formatted from top to bottom, and when searching for info I am positive it wouldn't even notice the number of nested tables, as they would be filtered out.

    Cheers, and sorry for the rather terse replies I give nowadays. Many obstacles pop up out of nowhere lately, which leaves me less time to write proper replies.
     
    INV, Dec 9, 2005 IP
    palespyder likes this.
  15. sufyaaan (Banned)

    #15
    It does NOT depend on W3C validation standards as yet. It might become a ranking factor in the future, but G won't tell you when they include it in their algo. So better make your code W3C-compliant today. :p
     
    sufyaaan, Dec 9, 2005 IP
  16. stuw (Peon)

    #16
    I'm working towards making all my new sites W3C compliant. I don't think it helps with SEO at the moment, but I do think it helps with load times, with ensuring that my HTML is valid, and with getting h1 etc. tags in the right order. I am now trying to make all my sites tableless. I find the Firefox web developer plugin invaluable: I use it to turn off the CSS and see if my pages make sense with no styles applied. This gives me a good idea of how a screen reader would tackle the page - and perhaps a search engine too?
     
    stuw, Dec 9, 2005 IP
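
    The no-styles view described above essentially reduces a page to its document outline, which is also roughly what a screen reader announces. A hypothetical outline worth aiming for - heading levels nest without skipping, so the page still reads sensibly with the CSS turned off:

        <h1>Page topic</h1>
        <h2>First subtopic</h2>
        <h3>Supporting detail</h3>
        <h2>Second subtopic</h2>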
  17. dougadam (Active Member)

    #17
    dougadam, Dec 10, 2005 IP
  18. minstrel (Illustrious Member)

    #18
    I agree. I think that modern spiders are a lot better at finding content and zooming past "structure" and "layout", nested tables or anything else. The old emphasis on getting all your content to the top of the code may (or may not) have been critical at one time - I'm not convinced it makes much difference any more.
     
    minstrel, Dec 11, 2005 IP