
Is Google Sitemaps getting lazy?

Discussion in 'Google Sitemaps' started by Johnburk, Oct 2, 2005.

  1. #1
    I have been using Google Sitemaps since the start. Up to a week ago it would download my submitted sitemaps within 2 to 6 hours.
    Lately it seems to be really slow, and sitemaps submitted 2-3 days ago are still pending.

    Does anyone else have this problem?
     
    Johnburk, Oct 2, 2005 IP
  2. dnthcaseme

    dnthcaseme Peon

    Messages:
    7
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    Yes, I have also faced this problem; it takes a long time to download.
     
    dnthcaseme, Oct 2, 2005 IP
  3. t2dman

    t2dman Peon

    Messages:
    242
    Likes Received:
    17
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Yes, they are taking a bit longer. Google Sitemaps is a great concept, but I can see that it could definitely create some programming and resource issues at Google.

    • If people always set the lastmod date to the current date even when the content has not changed, will Google learn not to trust that particular sitemap?
    • If the update frequency is set to, say, hourly, but the content actually changes every 6 months or so, will Google fall back to its default crawl frequency (roughly monthly, based on Google PR)?
    • If Google always did what the submitted sitemap says, it could be coming back quite a few more times. Is it possible for Google to run into a resourcing issue?
    Interesting.
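
    For reference, the fields in question look roughly like this in a sitemap file (illustrative URL and values; the namespace is the 0.84 schema Google used at the time):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-10-01</lastmod>
    <changefreq>hourly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```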
     
    t2dman, Oct 2, 2005 IP
  4. softplus

    softplus Peon

    Messages:
    79
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #4
    I think this behaviour is "normal" and more or less to be expected. "In the beginning" (always wanted to write a sentence starting with that) the webmasters using Google Sitemaps were a pretty small group; in the meantime they're being used all over the place (probably temporarily peaked, for various reasons). That uses a lot of resources, and even Google has to share what it's got among the whole crowd. The idea is great, but I think the biggest challenge Google is facing is Google + Sitemaps itself: Google wants a clean index and tries to keep "spammy" and similar sites out, but Sitemaps works the opposite way, trying to get as many sites fully indexed as possible. I bet they're working hard on a solution to this; it can't be as easy as the Google Sitemaps idea was in the first place :). And when sites Google classifies as "spammy" (whether they are or not) get delisted because of Google Sitemaps, you can bet there will be lots of "sitemaps = bad" posts in the forums (as there are now already).

    Things to consider:
    - really good links get the job done faster (at the moment at least).
    - a *bad* sitemap file *can* cause problems (e.g. duplicate content issues...)
    - change frequency + priority are *not* being used at the moment (according to Google)
    - the last change date for dynamic files (php, asp, etc. - and no, not ".etc") should not be based on server response headers (which will say it was changed "now"). Check out http://gsitecrawler.com/articles/meta-tag-date.asp .
    - the last change date is not authoritative, for the above reason and because many people set and forget their sitemap file instead of updating it regularly, so Google still crawls the site normally
    - sitemap files are add-only: it doesn't matter if you don't list all your URLs; they won't be punished, removed, etc. -> similarly, you can't use sitemaps to remove your old, obsolete URLs from Google.
    - Google Sitemaps is in beta and is constantly being updated - check your "My Sitemaps" regularly for changes, things for YOU to do, etc.
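
    To illustrate the last-change-date point above (a minimal sketch; the function name is made up), you can take the date from the file on disk instead of the server headers:

```python
import os
import time

# Minimal sketch: derive a sitemap <lastmod> value from the file's
# modification time on disk, instead of the server response headers
# (which report "now" for dynamically generated pages).
def lastmod_for(path):
    mtime = os.path.getmtime(path)  # seconds since the epoch
    return time.strftime("%Y-%m-%d", time.gmtime(mtime))  # W3C date format
```

    Of course, if the page's output depends on a database, the script file's date isn't enough either; that's where the meta-tag approach in the linked article comes in.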

    There's probably more, but I must have lost it while sneezing just a moment ago :)
     
    softplus, Oct 2, 2005 IP
  5. t2dman

    t2dman Peon

    Messages:
    242
    Likes Received:
    17
    Best Answers:
    0
    Trophy Points:
    0
    #5
    I have very successfully used Google Sitemaps to remove old URLs from Google's index.

    If you list the old URLs in there, and have 301s pointing to the new URLs, you can very successfully remove the old URLs from Google's index.
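
    As a sketch of that setup (assuming Apache, with made-up paths), the 301 side looks like this in .htaccess:

```apache
# Hypothetical example: permanently redirect the old URL (still listed
# in the sitemap) to its replacement, so Google drops the old one
Redirect permanent /old-page.html http://www.example.com/new-page.html
```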

    But great post. Google Sitemaps is certainly developing well.

    I like the newish feature that reports which URLs it has not added. I occasionally see a few where the server has been down/busy, so Google has not been able to spider an otherwise working URL.

    It also helps me see how well my robots.txt has been working, since it lists those URLs in there as well.
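
    For example, a hypothetical rule like this in robots.txt would make the blocked URLs show up in that report instead of being crawled:

```
# Block all crawlers from the (made-up) /private/ section
User-agent: *
Disallow: /private/
```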
     
    t2dman, Oct 2, 2005 IP
  6. obrusoft

    obrusoft Peon

    Messages:
    79
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #6
    I think they're working hard to develop Sitemaps further.
    Time is money, and they must improve their services.
     
    obrusoft, Oct 6, 2005 IP