
Google Page Speed and Images

Discussion in 'Google' started by twalters84, Apr 15, 2010.

  1. #1
    Greetings,

    For some reason, under Page Speed in Webmaster Tools, it says my website takes 13 seconds on average to load a page.

    I am thinking that this might be because of webpages like this: Green Products

    On this particular webpage, there are a total of 30 product images loading. It is a nice way to find products, especially on a faster internet connection.

    If I did not display images when web crawlers access these types of pages, would Google see my site as faster? Would doing so be against the Google webmaster guidelines? I know I read somewhere that you are not supposed to show a crawler different content than a regular user sees. However, not displaying images might make the user experience more enjoyable. What are your thoughts on this?

    Thanks in advance for any advice or suggestions.

    Sincerely,
    Travis Walters
     
    twalters84, Apr 15, 2010 IP
  2. Sxperm

    #2
    I think you should check the file format of your current images first. If you choose the proper format for each kind of image, you get a much smaller file size while retaining quality (see the sketch below):

    Photos and similar - JPG
    Cartoons and similar - GIF / PNG
    Portraits and similar - PNG
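
    To make the format comparison concrete, here is a minimal sketch in Node.js; the sharp image library is an assumed third-party dependency and the file names are hypothetical, so treat this as an illustration rather than a drop-in script:

        var sharp = require('sharp'); // assumed third-party npm package

        // Photos and similar compress far better as JPEG than PNG.
        sharp('product-photo.png')     // hypothetical source file
          .jpeg({ quality: 82 })       // visually near-lossless for photos
          .toFile('product-photo.jpg')
          .then(function (info) {
            console.log('re-encoded as JPEG:', info.size, 'bytes');
          });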
     
    Sxperm, Apr 15, 2010 IP
  3. China_girl

    #3
    I don't think that is the problem, Travis, though I am sure sxperm's comment gives you an idea. Better to get all the images into one category with a fixed thumbnail size. There are lots of sites where the images are very tiny, with a specific extension for each category of images.
     
    China_girl, Apr 15, 2010 IP
  4. twalters84

    #4
    Hey there,

    Thanks for the responses.

    I realize some images are a lot larger than the size at which they are being served.

    The big problem is that there are about 60 million products, and the majority of the images are on third-party servers. I have scripts running that download and resize the photos, but this could take quite a long time.
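
    For reference, a job like that can be sketched in a few lines of Node.js. This is only an illustration, with an assumed sharp dependency and made-up URLs and paths, not the poster's actual script:

        var http = require('http');
        var sharp = require('sharp'); // assumed third-party npm package

        // Download one third-party product image, shrink it to a
        // thumbnail, and store it locally so it can be served from
        // this domain instead of the external server.
        function mirrorThumbnail(url, outPath) {
          http.get(url, function (res) {
            var chunks = [];
            res.on('data', function (c) { chunks.push(c); });
            res.on('end', function () {
              sharp(Buffer.concat(chunks))
                .resize(120, 120, { fit: 'inside' }) // keep aspect ratio
                .jpeg({ quality: 80 })
                .toFile(outPath)
                .then(function () { console.log('saved', outPath); });
            });
          });
        }

        // Hypothetical example image:
        mirrorThumbnail('http://externaldomain.com/xyz/xy.gif', 'img/xy-thumb.jpg');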

    One of my thoughts was that hiding them in the search results might not be a big deal, because the spider would still see the images on the detail pages (the pages behind the search results).

    Sincerely,
    Travis Walters
     
    twalters84, Apr 15, 2010 IP
  5. vagrant

    #5
    There are a lot of different elements on your pages coming from different domains, meaning the bot has to wait for a DNS lookup for each of those domains.
    Try to reduce the number of DNS lookups if possible by serving what you can from your own server.

    The page took several seconds to START displaying in my browser, and I have a 50Mb connection; the browser status bar showed it waiting on a lot of different elements of the page.

    img/tagline.gif NOT found
    img/bg_left-nav.gif NOT found

    Some browsers (and bots) do not handle a 404 for an image the same way as for an HTML file; they tend to wait until they time out (and some types of servers, depending on setup, do not even send a 404 for an image). Best to fix those errors just in case.

    If you run your page through http://www.websiteoptimization.com/services/analyze/
    you will see warnings such as:
    "The total size of your external scripts is 109665 bytes, which is over 20K" <- that is a lot for external files
    "The total size of your images is 376706 bytes, which is over 100K"
    "The total size of this page is 584107 bytes, which will load in 125.41 seconds on a 56Kbps modem. Consider reducing total page size to less than 100K"
    "The total number of images on this page is 39, consider reducing this to a more reasonable number"
    "The total number of objects on this page is 45, which by their number will dominate web page delay."

    As for Googlebot etc.: for the external images from Barnes & Noble and the like, serve them from a local URL that then redirects to the external domain,
    and block that URL in robots.txt.

    Example: if the Barnes & Noble images come from externaldomain.com/xyz/xy.gif,
    set your script to serve them from yourdomain/xyz/xy.gif and set that subdirectory to redirect to the external domain.

    In robots.txt:
    Disallow: /xyz/
    Disallow: /*.jpg
    Disallow: /*.gif
    and the bots will not wait for them to respond.
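
    A minimal sketch of that redirect trick in Node.js (the domain names and paths are the hypothetical ones from the example above): requests for /xyz/... on your own domain are bounced to the external host, while robots.txt keeps the bots from requesting them at all:

        var http = require('http');

        http.createServer(function (req, res) {
          // Local /xyz/ image URLs 302 to the third-party host;
          // robots.txt (above) stops crawlers from ever following them.
          if (req.url.indexOf('/xyz/') === 0) {
            res.writeHead(302, { Location: 'http://externaldomain.com' + req.url });
            res.end();
            return;
          }
          res.writeHead(404);
          res.end();
        }).listen(8080);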

    vagrant
     
    Last edited: Apr 15, 2010
    vagrant, Apr 15, 2010 IP
  6. checkblog

    #6
    I don't think the images will affect the SERPs, because the bot can't read images without alt text.
     
    checkblog, Apr 15, 2010 IP
  7. akhdiyat

    #7
    Yes, you're right.
     
    akhdiyat, Apr 15, 2010 IP
  8. vagrant

    #8
    The normal "indexing" Googlebot does not, nor does it parse CSS, JavaScript, etc. But the bot that is used to check page speed does load and read graphics, CSS, JavaScript, etc., though only a couple of pages every few weeks, as far as my server logs show. That's how they pick up cloaked pages, malware and iframes as well :)
     
    Last edited: Apr 15, 2010
    vagrant, Apr 15, 2010 IP
  9. twalters84

    #9
    Hey there,

    Thanks for the great responses.

    This amount includes the script.aculo.us library (49 KB gzip-compressed), a siteScript.js file for common JavaScript functions (23 KB uncompressed), and the Zapatec drop-menu scripts (36 KB compressed). Users only ever have to download these files once; after that they should be cached. I do not think it would be possible to get these scripts under the recommended size unless somebody tore the third-party scripts apart and kept just what my website needs.
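
    Since the scripts rarely change, the main win is making sure that "download once" actually happens. A sketch of far-future cache headers in Node.js (the bundle path is hypothetical, and the same header can be set in any server's config):

        var http = require('http');
        var fs = require('fs');

        http.createServer(function (req, res) {
          if (req.url === '/js/site-bundle.js') { // hypothetical bundle
            res.writeHead(200, {
              'Content-Type': 'application/javascript',
              // One year: repeat visitors hit the browser cache instead
              // of re-downloading ~100 KB of scripts.
              'Cache-Control': 'public, max-age=31536000'
            });
            fs.createReadStream('js/site-bundle.js').pipe(res);
            return;
          }
          res.writeHead(404);
          res.end();
        }).listen(8080);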

    When the page first loads, all images are hidden until the body onload function executes a resize script. Would it be better to put a div where each image should be and make some sort of AJAX call, instead of using hidden placeholders?
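
    One common approach along those lines (a sketch only; the markup convention and the 200px threshold are assumptions): give each image a tiny placeholder src and put the real URL in a data-src attribute, then swap the real URL in once the image scrolls near the viewport, so the initial load only fetches visible images:

        // Assumed markup: <img src="blank.gif" data-src="real.jpg">
        function loadVisibleImages() {
          var imgs = document.getElementsByTagName('img');
          for (var i = 0; i < imgs.length; i++) {
            var img = imgs[i];
            var realSrc = img.getAttribute('data-src');
            if (!realSrc) continue;
            var rect = img.getBoundingClientRect();
            if (rect.top < window.innerHeight + 200) { // 200px look-ahead
              img.src = realSrc;
              img.removeAttribute('data-src');
            }
          }
        }
        window.onscroll = loadVisibleImages;
        window.onload = loadVisibleImages;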

    Good idea on the robots.txt file and images.

    Sincerely,
    Travis Walters
     
    twalters84, Apr 15, 2010 IP
    vagrant likes this.
  10. whitespparow

    #10
    There are many tools to check page speed, and you can try to reduce the size of the images. Also try to concentrate on the CSS and JS files.
     
    whitespparow, Apr 15, 2010 IP
  11. vagrant

    #11
    Sorry, I could only spot SOME of the possible problems... that does not mean I know the answers, though :(

    I don't know the Zapatec drop-menu. I tend to use the Ruthsarian Menus, as they work with just CSS and no JavaScript (better SEO-wise as well) and are MUCH smaller in size, whereas yours vanish without JavaScript in Firefox on my computer.

    That is OK for users but not for bots :(
    It IS one reason Google will only ever put a very LOW SERP-boost value on download speed, one that will not matter for MOST websites, IMHO.

    Your other questions I could only take a vague guess at, so I won't comment.
     
    vagrant, Apr 15, 2010 IP
  12. twalters84

    #12
    Hey there,

    Is there any way to emulate lower connection speeds?

    Is there any way to tell what connection speed a user has through JavaScript, ColdFusion, etc.?

    I would assume most people have migrated away from dial-up connections? What is the lowest connection speed I should be concerned about? I feel like the more you cater to lower connection speeds, the less optimized a website can be for faster connections, and users on faster connections would not get to experience the web to its fullest potential.
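
    On the JavaScript question: one rough client-side trick (everything here is an assumption, including the probe URL, its size, and the 256 kbps cut-off) is to time the download of a file of known size and derive an approximate speed. Accuracy is poor for small files, but it is enough to pick a low-bandwidth page variant:

        // Time a known-size download to estimate bandwidth in kbps.
        function estimateKbps(testUrl, sizeBytes, callback) {
          var img = new Image();
          var start = new Date().getTime();
          img.onload = function () {
            var seconds = (new Date().getTime() - start) / 1000;
            callback(Math.round((sizeBytes * 8 / 1000) / seconds));
          };
          // Cache-buster so the browser really downloads the file.
          img.src = testUrl + '?t=' + start;
        }

        estimateKbps('/img/probe-50kb.jpg', 51200, function (kbps) {
          if (kbps < 256) document.body.className += ' low-bandwidth';
        });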

    If anybody is interested, I am willing to pay for assistance to increase overall page speed. Please email me at and we can get a project set up. I am also willing to work through Rentacoder.

    Sincerely,
    Travis Walters
     
    twalters84, Apr 16, 2010 IP