Page Speed and Google Ranking

Discussion in 'Google' started by jackburton, Mar 4, 2005.

  1. #1
    Just curious. Does your site's loading time affect how Google indexes/ranks your site in any way? I just recently made some extensive changes to the site -- basically doubling the time it takes to download (added more fancy schmancy graphics, tables, etc.). Anyone know if the speed with which a site loads affects how it's spidered/indexed/ranked in their listings?
     
    jackburton, Mar 4, 2005 IP
  2. T0PS3O

    T0PS3O Feel Good PLC

    #2
    I have an Amazon script which takes up to 13 seconds to load (3 meg cable -- the server-side script is the bottleneck). It's indexed just fine. I doubt load time is a variable in the ranking algo. Otherwise Wikipedia would be at rock bottom.
     
    T0PS3O, Mar 4, 2005 IP
  3. jackburton

    jackburton Peon

    #3
    Good to know. Since Google brings in a large portion of my visitors (word of mouth is the other half), I'd hate to accidentally lose half of my visitors in one fell swoop.
     
    jackburton, Mar 4, 2005 IP
  4. CiscoODI

    CiscoODI Peon

    #4
    I know that Alexa ranks sites by download time and Google bases some things on Alexa data, but I don't think that Google bases rank on download time.
     
    CiscoODI, Mar 6, 2005 IP
  5. T0PS3O

    T0PS3O Feel Good PLC

    #5
    Really? Like what?
     
    T0PS3O, Mar 6, 2005 IP
  6. kewler

    kewler Peon

    #6
    yes - me too please
     
    kewler, Mar 6, 2005 IP
  7. minstrel

    minstrel Illustrious Member

    #7
    You too what?
     
    minstrel, Mar 6, 2005 IP
  8. BlueFusionX

    BlueFusionX Guest

    #8
    It shouldn't.
     
    BlueFusionX, Mar 6, 2005 IP
  9. noppid

    noppid gunnin' for the quota

    #9
    Turn images off in your browser and see how fast the page loads. I'd say a bot is viewing your page with images off too -- it only fetches the HTML.
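    To see roughly what a text-only bot downloads, here's a minimal Python sketch (the URL is just a placeholder) that fetches only the HTML and times it -- images, CSS and other files referenced by the page are never requested:

        import time
        import urllib.request

        def fetch_html_only(url):
            """Download just the HTML document, the way a text-only spider would."""
            start = time.time()
            with urllib.request.urlopen(url) as response:
                html = response.read()  # embedded images/CSS/JS are NOT fetched
            elapsed = time.time() - start
            print(f"{url}: {len(html):,} bytes of HTML in {elapsed:.2f} s")
            return html

        fetch_html_only("http://www.example.com/")  # placeholder URL

    If that number is a fraction of what your browser shows with images on, the bot's view of your page is a lot lighter than your visitors' view.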
     
    noppid, Mar 6, 2005 IP
  10. inthedark

    inthedark Peon

    #10
    I would think that page size and load time are 2 of the 100 factors taken into account...people like fast pages
     
    inthedark, Mar 6, 2005 IP
  11. daamsie

    daamsie Peon

    #11
    I thought they just didn't take into account any text past the 100k mark. Other than that, I doubt size has any influence.

    As far as SEO goes, it would still be best to tell your story in 10 pages as opposed to 1 - that gives you ten times as many opportunities to target specific keyphrases.

    Alexa uses speed in their ranking?? I don't think so. As far as I can tell, their rank is exactly the same as Google's -- but yes, they do display the load time as part of the page information.
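    If you want to check your own pages against that often-quoted 100k figure, a quick Python sketch (placeholder URL, and the cutoff is just the rumoured number, not an official one) could look like this:

        import urllib.request

        INDEX_LIMIT_BYTES = 100 * 1024  # the rumoured ~100k cutoff -- not an official figure

        def check_page_size(url):
            """Report the raw HTML size and flag pages over the rumoured cutoff."""
            with urllib.request.urlopen(url) as response:
                size = len(response.read())
            status = "over" if size > INDEX_LIMIT_BYTES else "under"
            print(f"{url}: {size / 1024:.1f} KB of HTML ({status} the 100 KB mark)")

        check_page_size("http://www.example.com/")  # placeholder URL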
     
    daamsie, Mar 6, 2005 IP
  12. aspotism

    aspotism Peon

    #12
    I've heard it does have something to do with rank but like inthedark said, it's very little.
    No worries.
     
    aspotism, Mar 6, 2005 IP
  13. minstrel

    minstrel Illustrious Member

    #13
    For human visitors, yes. For spiders, no. They only read text anyway so they really don't care about the flash, animated gifs, heavy backgrounds, and massive images on the page because they don't "see" them.
     
    minstrel, Mar 6, 2005 IP
  14. inthedark

    inthedark Peon

    #14
    True, but I was thinking more along the lines of Googlebot keeping track of how long retrieving X characters of text from a page takes... all else being equal, a site that serves up its text faster could be chosen over a slower one...
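    Nobody outside Google knows whether they actually record that, but the "text served per second" idea is easy enough to measure yourself. A rough Python sketch (the URLs are placeholders) comparing how quickly two pages serve their text:

        import time
        import urllib.request

        def text_throughput(url):
            """Measure how many characters of HTML per second a page serves."""
            start = time.time()
            with urllib.request.urlopen(url) as response:
                text = response.read().decode("utf-8", errors="replace")
            elapsed = time.time() - start
            return len(text) / elapsed if elapsed > 0 else float("inf")

        # All else being equal, the faster-serving page would win under this hypothetical metric.
        for url in ("http://www.example.com/fast.html", "http://www.example.com/slow.html"):
            print(f"{url}: {text_throughput(url):,.0f} chars/sec")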

    That also brings up a pet peeve of mine I'll do a little rant about... the web was designed to be a multimedia experience and "modern" spiders are nothing but text parsers... when will we see the next-gen spiders?
     
    inthedark, Mar 6, 2005 IP
  15. daamsie

    daamsie Peon

    #15
    First of all, the web wasn't 'designed' to be a multimedia experience - that's just what it has grown to be. Have a look here if you're interested in the original intentions from 1989.

    Secondly, perhaps you'd care to look into the Google Labs area to see what the modern spiders are up to -- i.e. indexing TV shows, etc. And then of course there's Google Images, Google SMS and so on.

    I think there's enough multimedia being indexed already. I'm not sure what your gripe is.

    I think that search engine spiders actually do the net a great service. As the no.1 'blind' users on the internet, they help to keep webmasters from making inaccessible content.
     
    daamsie, Mar 6, 2005 IP
  16. inthedark

    inthedark Peon

    #16
    I guess my gripe is similar to the gripe people have about Microsoft... they lock people into 2nd-rate technology. Because sites that use Flash and Java, for example, don't rank well, people continue to create 1990-style pages when we could be much further down the road. I look forward to the day when a search engine will return all relevant files for a search term -- be they text, audio, video, or whatever. My current PC does a lot more than display text characters.
     
    inthedark, Mar 7, 2005 IP
  17. daamsie

    daamsie Peon

    #17
    Not that I don't understand what you're saying, but you're seeing it all wrong.

    If you create a 2nd-rate site that doesn't bother with modern accessibility standards (this isn't 1995), then you won't have a chance in Google. BUT, if you keep up with the times and build your site according to standards, the bots will have no problem with your content. And yes, you can create an accessible site that uses Flash if you really must.

    Far from locking us into 2nd rate technology, the spiders are a good reminder of one reason to make your content accessible to all. They're just another blind visitor.

    I'm not even sure I understand what you expect them to do anyway; do you want them to 'look' at an image and somehow magically figure out that it's a tree?? Can you imagine the processing power (and potential errors) involved in doing that for billions of images online?
     
    daamsie, Mar 7, 2005 IP