Troubleshooting Search Engine Problems

Discussion in 'Search Engine Optimization' started by eDom.co.uk, Dec 19, 2006.

  1. #1
    eDom.co.uk, Dec 19, 2006 IP
  2. hooperman

    hooperman Well-Known Member

    #2
    Rubbish.
    Validation is fine for other reasons, but unless there are major flaws in your code, having valid pages won't help in SEO.
    To know that 'alarm bells' go off, you would either have to have been told by the Google Guys or have access to the algorithm's code. You might be right, but you should prefix your opinion with "I think that".
    I see no reason why increased use of CSS would have any SEO benefit. Why would it? It doesn't make the page's content any more 'relevant' or any higher quality.
    Robots.txt may be your friend, but all it does is exclude bots (that obey it) from certain areas.


    I like the Christmas theme the site has though :)
     
    hooperman, Dec 19, 2006 IP
  3. DSeguin

    DSeguin Peon

    #3
    Almost all websites do not have valid code, yet many display fine in most browsers. Spiders don't necessarily see a page the way a browser does, and what looks like a minor flaw in the code can prevent them from accessing content on your page.

    Using CSS not only cuts out a lot of repeated code, it also lets you place content flexibly on the page while keeping that same content near the top of your source, where spiders like it a lot more, and that will give your site a better quality rank.
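
    Roughly what I mean, as a made-up sketch (the id names are just for illustration): a two-column layout where the main copy comes first in the source and the sidebar is positioned with CSS.

    Code:
    <style type="text/css">
    #wrap    { position: relative; }
    #content { margin-right: 220px; }  /* leave room for the sidebar */
    #sidebar { position: absolute; top: 0; right: 0; width: 200px; }
    </style>

    <div id="wrap">
        <div id="content">
            <h1>Page heading</h1>
            <p>The important copy sits near the top of the source.</p>
        </div>
        <div id="sidebar">
            <!-- navigation, ads, search box can come later in the source -->
        </div>
    </div>

    The sidebar comes last in the markup but still shows up on the right of the page.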


    Spiders still crawl where they're told not to go; they just don't index it.
     
    DSeguin, Dec 19, 2006 IP
  4. hooperman

    hooperman Well-Known Member

    #4
    Do you have an example of a minor flaw in code that prevents spiders from accessing content on a page that I could look at?

    How do you know that keeping content near the top of a page gives you a better rank? To know that, you would either have to have seen the algorithms that search engines use or have tested your theory thoroughly. I doubt that you've seen the algorithms, so any chance I could see your test data?
     
    hooperman, Dec 20, 2006 IP
  5. webado

    webado Peon

    #5
    Hi there.

    I am the one who wrote the guide eDom.co.uk posted the link to.

    I am not going to argue any of this stuff with you guys. I'll just point out that nowhere do I mention SEO - I am talking about logical, technical ways to solve search engine trouble, given that voodoo is largely overrated.

    I don't specifically call the items in my checklist SEO because I'd hope SEO is what you do after the site is perfect from all technical viewpoints. This is called site debugging and it should be a prerequisite to actual SEO. I don't do SEO. I debug.


    I assume you are not much involved with the Google Webmaster Tools group.
    If you were, and paid attention, you'd have seen that the verification meta tag offered by Google to those who want to submit sitemaps is invalid for all doctypes. That in itself looks like a very minor flaw, except it can be very nasty.

    In particular, for a non-xhtml doctype (or when no doctype is specified), a meta tag presented as closed with /> has the effect of cutting the head of the document right there: the rest of the page is skipped right up until </body>, where parsing resumes just long enough to merrily find the </html>. I am talking about Googlebot here, when it crawls in what I call 'structured' mode. This seems to square with the multitude of sites that suddenly went belly up after adding this verification meta tag. The homepage might have remained in the index, but the other pages either never got indexed (for a brand new site) or dropped into supplemental almost immediately. Fix this meta tag and the site is back to normal. Leave it broken and the rest of the site consists of orphan pages, because their connection to the homepage has been severed by the robot's inability to discover the rest of the page that holds the links.
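
    To make this concrete, here is a made-up head section of the kind I mean, under an HTML 4.01 doctype (I'm writing the verification tag from memory, and the content value is only a placeholder):

    Code:
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
        "http://www.w3.org/TR/html4/loose.dtd">
    <html>
    <head>
    <!-- Broken: the trailing /> is what a strict parser trips over under this doctype -->
    <meta name="verify-v1" content="XXXXXXXXXXXX" />
    <title>Example page</title>
    <link rel="stylesheet" type="text/css" href="style.css">
    </head>
    <body>
    ...
    </body>
    </html>

    The fix is simply to drop the slash:

    Code:
    <meta name="verify-v1" content="XXXXXXXXXXXX">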

    There are so many such cases that it has to be due to this broken meta tag. The sites also get back to normal soon after the meta tag is fixed. I said normal - I didn't say they miraculously become a leading site. There are obviously many other factors involved. But this is one extraordinarily silly little bug that has such a big effect.

    Of course in many cases those sites also have numerous other validation problems, because otherwise their webmasters would have caught the blooper and fixed it themselves way before publishing their pages.

    And it's not just that meta tag. It's any other meta tag or link tag appearing in the head of a non-xhtml doctype that's incorrectly closed that way.
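
    The same slip with a link tag, purely as an illustration - wrong for a non-xhtml doctype:

    Code:
    <link rel="stylesheet" type="text/css" href="style.css" />

    And the correct form for that doctype:

    Code:
    <link rel="stylesheet" type="text/css" href="style.css">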

    These sites all went south very quickly, within days of adding this meta tag while in the sitemaps program.

    My explanation, and I will not be dissuaded on this because it makes sense, is that the Googlebot that crawls sites participating in the sitemaps program is programmed to parse pages syntactically, not just read them as text, under the assumption of a certain degree of correctness in the syntax, especially with respect to block-level code. It assumes correct structure so that it can index better, but that assumption makes it fail when the structure is broken. It acts just like a compiler.

    The argument that many, even most, sites are invalid doesn't hold any water when we are talking about the new breed of robots like the Googlebot that has been deployed in conjunction with sitemaps. Once a site gets crawled by this bot, all bets are off and the site gets considered on its own merits at that moment. If it's broken, it will suffer the consequences.

    It's entirely possible that Google will eventually modify the Googlebot program yet again to overlook such syntax errors - then maybe this won't happen any more. But it would be funny, because then it will have lost the very edge it gained by being so strict. They may do it to placate the people who are disillusioned by the negative effect they experienced under this sitemaps program.

    But then it's like asking a long-time practitioner of some profession, a physician for instance, to take a new professional accreditation exam, only to discover that he's in fact failing because the requirements are much stricter than way back when he first got his diplomas. What a blow!

    If you guys don't want to believe, that's OK. I follow my logic and act accordingly and it works for me.

    I don't expect to convert any of the non-believers. I only dropped by because I found a backlink to my site originating here and it intrigued me.
     
    webado, Jan 17, 2007 IP
  6. webado

    webado Peon

    #6
    If you've ever used the W3C validator, you've certainly encountered errors such as this one:

    That's pretty much what I'm talking about.
     
    webado, Jan 18, 2007 IP