
Bizarre Search Engine Penalties

Discussion in 'Search Engine Optimization' started by crichey, Dec 4, 2006.

  1. #1
    Recently it seems that search engines are handing out penalties left, right, and center, and some of these penalties are pretty weird.
    A website that uses the software provided by the company I work for appears to be a good example of a site suffering from a strange penalty.
    The site is http://www.fraserleonard.ca/. It has a PR of 5 and does pretty standard SEO work: a quality title, meta tags, etc., with a number of quality incoming links.
    However, when you search for fraserleonard.ca on Google.com, no fraserleonard.ca pages appear in the first couple of pages of search results. I thought the site could be suffering from the Google minus-30 penalty, but for some Google searches the site still appears to rank normally.
    The website's SEO consultant thinks it has to do with the site's architecture. I am skeptical of this, as a number of other sites have the same architecture provided by the software, and this is the only one that appears to be suffering from a penalty. It is probably important to note that the site does not do that well in the search results of any of the major search engines.
    Any ideas, suggestions, theories, or feedback on the reason for this possible penalty would be greatly appreciated.
     
    crichey, Dec 4, 2006 IP
  2. DSeguin

    DSeguin Peon

    Messages: 70 | Likes Received: 1 | Best Answers: 0 | Trophy Points: 0
    #2
    I'm not sure exactly how spiders crawl, or whether this affects your site in any way, but after viewing your source code there appears to be a ton of blank space and indentation for no purpose whatsoever. As soon as I view the source, all I see is white space; the actual code is placed further down. There is also another huge blank gap halfway through. Maybe it doesn't affect spiders at all, just wondering.
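    [A quick way to check for the symptom described above is to inspect the raw page source and see whether anything precedes the doctype. This is a minimal sketch, not anything from the thread; in practice you would feed it the fetched page body decoded to text.]

    ```python
    def whitespace_before_doctype(html: str) -> bool:
        """True if blank space (spaces, tabs, newlines) precedes the first markup."""
        return len(html) > 0 and html != html.lstrip()


    def leading_blank_lines(html: str) -> int:
        """Count the empty lines that appear before the first non-blank line."""
        count = 0
        for line in html.splitlines():
            if line.strip():
                break
            count += 1
        return count
    ```

    [Leading whitespace before the doctype is legal HTML and browsers tolerate it, but it was a common symptom of template engines emitting stray output before the document proper.]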
     
    DSeguin, Dec 4, 2006 IP
  3. hans

    hans Well-Known Member

    Messages: 2,923 | Likes Received: 126 | Best Answers: 1 | Trophy Points: 173
    #3
    by default, line 1 of a page starts with
    <!DOCTYPE html PUBLIC ....

    on the site you mentioned, everything starts with plenty of blank lines.

    also, your code needs emergency care from a professional webmaster rather than an SEO consultant.
    see
    http://validator.w3.org/check?verbose=1&uri=http://www.fraserleonard.ca/

    without clean code, any SEO effort may be fully lost, depending on the type of errors built into a site.
    some of the errors may well have a destructive impact on crawlers and thus on the SERPs.
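    [As a rough illustration of why malformed markup can trip up a simple crawler, here is a minimal tag-balance check built on Python's standard html.parser. This is my own sketch; it is nowhere near a real validator like the W3C service linked above, which checks far more than tag nesting.]

    ```python
    from html.parser import HTMLParser

    # Void elements never take a closing tag, so they are skipped entirely.
    VOID_TAGS = {"area", "base", "br", "col", "embed", "hr", "img",
                 "input", "link", "meta", "param", "source", "track", "wbr"}


    class TagBalanceChecker(HTMLParser):
        """Track open tags on a stack and record obvious mismatches."""

        def __init__(self):
            super().__init__()
            self.stack = []
            self.problems = []

        def handle_starttag(self, tag, attrs):
            if tag not in VOID_TAGS:
                self.stack.append(tag)

        def handle_endtag(self, tag):
            if self.stack and self.stack[-1] == tag:
                self.stack.pop()
            else:
                self.problems.append(f"unexpected </{tag}>")


    def tag_problems(html: str) -> list:
        """Return a list of mismatch descriptions; empty means no obvious problems."""
        checker = TagBalanceChecker()
        checker.feed(html)
        checker.close()
        return checker.problems + [f"unclosed <{t}>" for t in checker.stack]
    ```

    [A browser recovers from unclosed tags with elaborate error-handling rules; a naive stack-based parser like this one simply reports the damage, which is the point hans is making about crawlers.]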
     
    hans, Dec 4, 2006 IP
  4. crichey

    crichey Peon

    Messages: 19 | Likes Received: 0 | Best Answers: 0 | Trophy Points: 0
    #4
    Thank you for taking the time to look at what is going on here.
    You are right that the code has some issues for sure, but a number of other sites that use this same code are able to do quite well in the search engine results. Could it be that this site just happened to be picked up by the search engines' filters, and that the others will likely get penalized in the future?
    An example of another site that uses this same code is:
    http://www.occasionallygifted.com
     
    crichey, Dec 4, 2006 IP
  5. hans

    hans Well-Known Member

    Messages: 2,923 | Likes Received: 126 | Best Answers: 1 | Trophy Points: 173
    #5
    there is no penalty from any SE for invalid code.
    invalid code is suicide by the site owner ...
    anyone killing a bot is committing suicide and needs no further penalty at all.
    SEs have a right to expect correct and clean code; to work at top efficiency, SEs cannot afford to build in the error workarounds that large browsers do.
    each site owner knows the rules.
    comparing your site to other invalid sites, and avoiding clean work by using other sites as the standard, is no assurance of success.
    W3.org is the only standard to look UP to and compare a site against if success is a serious goal in business.
    all others have to accept failure and be happy with it.
     
    hans, Dec 4, 2006 IP