House Cleaning by Google?

Discussion in 'Search Engine Optimization' started by mcmuney, Sep 16, 2009.

  1. #1
    In the past I had used a free service to create an XML sitemap file, which had a maximum of 500 links. Anyhow, my site had about 22,000 links on Google. Recently, I modified the XML file manually and added nearly 33,000 links (an upgrade from 500); however, since then, though it could be a coincidence, the number of indexed links started to dwindle. It's now at 11,500, roughly half of what it used to be.

    Is this related to the xml sitemap or is Google doing house cleaning?
     
    mcmuney, Sep 16, 2009 IP
  2. Pixelrage

    Pixelrage Peon

    Messages:
    5,083
    Likes Received:
    128
    Best Answers:
    0
    Trophy Points:
    0
    #2
    Was it that xml-sitemaps.com site? I used to use that, too. I can't see how the sitemap would be to blame... Google is probably just acting anal.
     
    Pixelrage, Sep 16, 2009 IP
  3. bank

    bank Peon

    Messages:
    185
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Are you joking? Did you seriously add 32,500 links to a sitemap by hand?

    I can do that with one click. To be honest, XML sitemaps are a waste of time; they mean nothing for your rankings or indexed-page volume.
     
    bank, Sep 16, 2009 IP
  4. Canonical

    Canonical Well-Known Member

    Messages:
    2,223
    Likes Received:
    141
    Best Answers:
    0
    Trophy Points:
    110
    #4
    I doubt it. XML sitemaps are used by Google to "assist" them with crawling your site. The most useful assistance an XML sitemap gives them is the PRIORITY of each URL. Unfortunately, LOTS of people create sitemaps with every URL in the file having the same 0.5 priority...

    The PRIORITY tells Google: if you're only going to index X of my URLs, this is the order of importance in which I'd like you to choose those X URLs.

    Sitemaps also help Google find URLs that might be hard to reach (or even unreachable) by following links on the site. Perhaps a page is 10 clicks away from the home page and never gets found, but it's important. Giving it a high priority could get it indexed where it wouldn't be if you just had them crawl your site naturally.
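
    For reference, a minimal sitemap entry with an explicit priority (the URL below is just a placeholder) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- placeholder URL; priority ranges from 0.0 to 1.0, default 0.5 -->
    <loc>http://www.example.com/deep/important-page.html</loc>
    <priority>0.9</priority>
  </url>
</urlset>
```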
     
    Canonical, Sep 16, 2009 IP
  5. mcmuney

    mcmuney Well-Known Member

    Messages:
    834
    Likes Received:
    10
    Best Answers:
    0
    Trophy Points:
    128
    #5
    markn26: I used xml-sitemaps.com to generate the free sitemap.

    bank: I used Excel, it wasn't bad. Plus, I created a template for future use. My next step is to automate it :)

    Canonical: Surprisingly, the latest version of the sitemap from the site listed above didn't include the PRIORITY of each URL. I thought it was a bit odd, but I figured they know what they are doing. Could that be the problem?
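
    For the automation step, a rough sketch in Python (the URLs and priority values below are just placeholders) could look like this:

```python
# Rough sketch: generate an XML sitemap from a list of (url, priority) pairs.
# The URLs and priorities here are made-up placeholders.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap XML for a list of (loc, priority) tuples."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, priority in urls:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(loc))       # escape &, <, >
        lines.append('    <priority>%.1f</priority>' % priority)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

sitemap = build_sitemap([
    ('http://www.example.com/', 1.0),
    ('http://www.example.com/deep/page.html', 0.3),
])
print(sitemap)
```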
     
    mcmuney, Sep 16, 2009 IP
  6. bank

    bank Peon

    Messages:
    185
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    0
    #6
    After extensive testing, I've found that Google virtually ignores the priority, partly for the reason you mentioned: people fail to set it, "or" they set every page on their 100k-page site to maximum priority thinking it will help.

    Because it's so prone to user error, and only a small portion of sites use it, it's not reliable. Google relies instead on how often the page changes, inbound links to the page, domain authority, etc.

    If a page is 10 levels deep and never gets found, the site has architecture issues and/or link-equity problems. Also, if Google can't find it, chances are users won't either.

    A sitemap won't "fix" those problems; it won't help those pages rank, pull traffic, or be more accessible, which is the end goal.

    In fact, a sitemap will just obscure those problems and make them much harder to diagnose and fix.

    XML sitemaps are pointless, and they almost always cause more problems than they solve. They were an "ok" idea 5 years ago, but it's 2009 now.
     
    bank, Sep 16, 2009 IP