Optimizing for Faster Loading

Discussion in 'HTML & Website Design' started by Mr.Dog, Dec 13, 2012.

  1. #1
    Hi,

    I'm working on optimizing multiple sites for faster loading... the thing is that one of the smallest sites is visually one of the slowest to load, yet GTMetrics.com and WhichLoadsFast.com both give it a very good score.

    What I see is that only the images lag behind: it takes 3-5 seconds for the background and a small (about 400px wide) main page image to both display entirely.

    Both are very simple; one is a .GIF, the other a .JPEG. The entire site is very simple, but it still loads slowly...

    For such a simple site it loads too sluggishly, while I can see tons of small blogs and other sites displaying far more (and more complex) images faster.

    So I am digging deeper than normally to see what else I could do.

    So here are several questions that I have...

    1) Should I set the sizes of images in HTML? (width, height... otherwise they display automatically anyway)...

    2) I am using this doctype... not sure if this has any influence on speed (guess not or does it?):
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

    3) GTMetrics.com finds "404 error" for the favicon... as far as I know, favicon is not needed, so why this problem?

    4) GTMetrics tells me to make fewer HTTP requests - how do I do this?

    5) It also tells me to "use a content delivery network" (CDN) >>> why, if it's such a simple website?

    6) "There are 15 components with misconfigured ETags" >>> what does this mean?

    7) "Leverage browser caching" >>> means what?

    8) "Use efficient CSS selectors" >>> how?

    9) "Specify a character set early"

    Anything I could do to speed up that site's images?

    I over-optimized them; it's basically 1 small image, 4 very small images and a very low-res pattern background.

    This is just one of many sites I am currently re-optimizing. It's a headache to do this work and I can't seem to make the images load faster.
     
    Mr.Dog, Dec 13, 2012 IP
  2. Rukbat

    Rukbat Well-Known Member

    Messages:
    2,908
    Likes Received:
    37
    Best Answers:
    51
    Trophy Points:
    125
    #2
    No, set them in the source image. If you want a 384x200 image, make the file 384x200, then upload that file to the server.
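
    If you do also put width and height attributes in the HTML, make them match the real file dimensions rather than using them to shrink a bigger image. A sketch (the filename is just a placeholder):

    <img src="photo-384x200.jpg" width="384" height="200" alt="Main page photo" />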

    Don't use a transitional doctype. That said, don't use XHTML unless you specifically need it - use an HTML doctype.

    I don't know GTMetrics - they may just check for a favicon even if you have no code specifying one. (If you do specify one, and it's missing, that's a missing file.)

    That depends on your code. No one can tell you without seeing it.

    If you're using content that you can get from a CDN, use one. (For example, if you're including jQuery.) Why? Because anyone who has already cached that file will load it from cache, not from your site. It saves traffic and it's faster (and it reduces the number of HTTP requests).
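
    For example, a sketch of pulling jQuery from Google's CDN instead of your own server (the version number is just whatever happens to be current):

    <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js"></script>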

    That's a server thing. Unless you run the server you can't do anything about it, AFAIK.

    Telling browsers how long they're allowed to keep your files in cache. Things like using a CDN help with that too.

    By learning CSS - yours is probably not as efficient as it could be.

    You should have a tag like <meta http-equiv="Content-Type" content="text/html; charset=utf-8">

    The smaller the image files, the faster they'll load.
     
    Rukbat, Dec 13, 2012 IP
  3. trickolla

    trickolla Peon

    Messages:
    6
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    1
    #3
    One more thing: use fewer plugins - they also slow down loading speed.
     
    trickolla, Dec 13, 2012 IP
  4. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,999
    Best Answers:
    253
    Trophy Points:
    515
    #4
    First off, those types of "tools" often just turn you into the tool. They are unreliable, filled with nonsensical misinformation, and in my experience pack you so full of sand you could change your name to Sahara.

    Over the next week or so I'll be adding a "So what's wrong with Google Page Speed and YSlow" (which those sites you linked to simply wrap) to my list of "What's wrong with..." articles for that very reason.

    This was quite evident (was it you I was talking to via PM?) where a 1 megabyte site made of 50 separate files, which even their own systems said took 11.5 seconds (takes around 30 seconds here), got a B, while a site of mine built from 22 files in 67k that loads by their own waterfall chart in 1.1 seconds (loads as fast as you can go to it here) got a D... Meaning their own speed rating has NOTHING TO DO WITH THE SPEED OF THE SITES!?!?!

    So you want to take those tools with... well, I was gonna say a grain of salt, but to be honest you might want to visit Bonneville, Utah.

    Keep in mind that image loads may also be delayed by SCRIPTS, excessive CSS, excessively complex markup resulting in a harder to parse DOM -- they can all add up to delays. (part of why I say if you need more than 32k of javascript for a 'normal' website, you're pissing on it!)

    The only thing that affects is rendering: when the sizes are in the HTML, the browser can reserve space for the images as the page loads so a 'reflow' isn't necessary. That's a rendering issue, NOT a site speed issue -- unless of course you have needlessly convoluted markup that makes reflows painfully slow to the point it affects the download (a rare occurrence at best).

    Transitional LITERALLY MEANS "In transition from 1997 to 1998 coding practices". Given that we are within spitting distance of 2013, well... That should give you your answer right there!

    Unlike Rukbat, I prefer XHTML 1.0 -- so keep that. It's cleaner and clearer, with better defined and consistent structural rules... rules in specifications and things like guidelines are a good thing no matter what the transitional holdouts and HTML 5 zealots who never pulled their heads out of 1997's arse seem to think, as they give you a guide that makes it EASIER to code and make fewer mistakes!

    It will fill up your server log with 404 errors as even if you don't state one in your document, some browsers (IE, Opera) will still look for the file.

    WORSE, 404's aren't cached, so it can actually consume MORE bandwidth not having one due to the extra file requests and the long timeout a 404 can often take, as well as it transmitting the same 404 message back (which is about the same size as a 16 or 256 color ICO file would be anyways!)

    It also helps to have favicons so that when people have you open in a tab, on the taskbar, etc, etc, they can see what site it is quickly. Even better since in some better browsers (Opera) or with extensions you can drag and drop favicons to make quick-launches.

    http://www.cutcodedown.com/images/ewiusb120.jpg

    Notice between the navigation buttons and the address bar the favicons for various websites...
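
    If you'd rather not leave browsers guessing at /favicon.ico, you can also declare it explicitly in the HEAD -- a sketch, assuming the file sits in the site root:

    <link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />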

    Use fewer separate files. Combine multiple javascripts down to single files and/or swing an axe at endless pointless javascript asshattery like anything jQuery-based. Use image recombination techniques like the incorrectly named "CSS Sprites" for things like icons and theme images (anything that doesn't need to tile and isn't a content image)... Combine down multiple CSS files (if any) by MEDIA type, assuming you are even using MEDIA types. (Which I'm shocked how many people are unaware of, since it's the FIRST THING YOU SHOULD LEARN ABOUT USING LINK AND CSS!!!)
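
    To illustrate both ideas, a sketch with made-up filenames and class names:

    <link rel="stylesheet" type="text/css" href="screen.css" media="screen,projection,tv" />

    /* in screen.css: one sprite sheet instead of separate icon files */
    .icon {
    	background: url(images/sprites.png) no-repeat;
    	width: 16px;
    	height: 16px;
    }
    .iconHome { background-position: 0 0; }
    .iconMail { background-position: -16px 0; }

    Code (markup):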

    That's the BS I have a problem with -- you have to remember that said tools are calling Google Page Speed and YSlow, so you have to view that statement with a bit of skepticism, given we're talking about people who get kick-backs from CDN vendors and/or sell CDN services themselves!

    Now, if you're loading up on more than 20 separate files on EVERY pageload (not just first-load), then a CDN makes sense -- but to be frank, if you are doing that for anything less than an image/video gallery, the site in question is probably half-assed, poorly coded garbage.

    Ah, yes... eTags, the latest trend in caching micromanagement to cover up a lack of understanding the point of caching.

    The concept of eTags is to have the server add extra information to every request to allow for 'versioning'... what the devil is wrong with checking the 'last-modified' timestamp is beyond me, and as such I consider eTags to be bloat that slows DOWN pages, since they're redundant, more bandwidth, and worst of all -- MORE CRAP FOR THE SERVER TO HAVE TO DEAL WITH!
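
    If you just want that warning gone and you're on Apache with .htaccess enabled (an assumption about your host, obviously), the usual one-liner is:

    FileETag None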

    It's another of those "throwing more code at it is not the answer!" situations.

    Honestly, I've started to think they wouldn't know 'proper caching models' from the hole in an Adobe DVD... There's this crazy idiotic idea that the default behavior in browsers for caching is somehow 'broken' -- and just like the previous item, they want you to throw more code at everything to make it 'faster'. Specifically, they want you to micromanage a bunch of extra response headers for every file at the server level... Expires, Cache-Control, Last-Modified and ETag. Honestly, if there's something wrong with the defaults for Cache-Control, Expires and Last-Modified, there's something wrong with the host, not the website.
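
    Concretely, the sort of header micromanagement they're asking for looks like this in an Apache .htaccess (a sketch, assuming mod_expires is even enabled on the host):

    ExpiresActive On
    ExpiresByType image/gif "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"

    Code (markup):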

    WORSE, the values they want you to use just bloat up and fill your cache, to the point it's not flushing or handling garbage collection enough -- this can actually result in not just increased memory use, but increased thrashing and fragmentation of both memory and the hard drive.

    ... and for what, to make it so the user doesn't see a first-load style delay tomorrow if they've not visited enough sites to flush the cache limit? A situation so unlikely said options shouldn't do anything ANYWAYS?!?

    Micromanaging BS that probably works great on fat bloated websites slapped together by idiots who have no business making websites in the first place.

    Their idea of "efficient" is also 100% fantasyland idiocy! Why do I say this? Because what they call an 'efficient' selector is basically slapping a class on everything instead of leveraging inheritance.

    They're under this weird delusion that the time it takes for the browser to parse selectors is big enough to have an impact on load speed. They say don't use the universal selector, use IDs and classes INSTEAD of tag names, don't mix IDs and classes, and avoid pseudo-states and generated content.

    That's just plain BULL, and I suspect the OOCSS whackjobs and the incompetent fools at turdpress who couldn't code their way out of a piss-soaked paper bag are behind it. Why? Because that basically just leaves slapping IDs and classes on everything -- THAT'S MORE CODE, and more code is SLOWER. PERIOD. Anyone telling you otherwise needs a good swift kick in the JUNK, EVEN when it's the alleged experts at Google.
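
    For the record, this is the difference they're on about (selector and class names here are made up):

    /* leveraging inheritance -- one rule, no extra markup */
    #mainMenu a { color: #fff; }

    /* the "efficient" version -- a class stamped on every anchor in the markup */
    .menuLink { color: #fff; }

    Code (markup):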

    While the charset is overridden by whatever the server sends in the response headers, your META to declare what character encoding you are using should be as close to the HEAD opening as possible... I put it RIGHT after HEAD.

    That's why all my documents start out thus:
    
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html
    	xmlns="http://www.w3.org/1999/xhtml"
    	lang="en"
    	xml:lang="en"
    ><head>
    
    <meta
    	http-equiv="Content-Type"
    	content="text/html; charset=utf-8"
    />
    
    Code (markup):
    There is no reason for anything other than ASCII7 characters (characters 0..127) at or before HEAD, and almost all character encodings have that range as a lowest common denominator... Just as classes and IDs shouldn't ever use non-ASCII7 chars, and it's illegal to use non-ASCII7 chars in CSS no matter what you tell it with the completely pointless @charset rule (excepting perhaps generated content).

    Of course the HTML 5 ninnies tried to do away with this... forgetting of course that it's there as a fallback for things like local testing and page saves/downloads -- situations where there are no response headers.

    If you've already optimized the images down to, say, 100k or less (counting content images, that's my cutoff for a page), the problem could in fact lie elsewhere in the page, as there's other stuff a browser is stuck downloading while rendering a site. The ORDER your files are loaded in can also have an effect... just moving a javascript from the start of the file to the end right before </body> (like Google Analytics -- NOT that I encourage its use) can speed up the page load.
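
    In other words, something like this at the end of the document (the script name is just an example):

    <p>Last of the page content...</p>

    <script type="text/javascript" src="analytics.js"></script>
    </body>
    </html>

    Code (markup):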

    Though you've not included a link to the site in question, so we're pretty much all guessing wildly as to your issues... though if it's that oh so familiar looking site in your signature, the 158k of images is a bit excessive, but not enough to explain the delay. If I were to take a guess, I'd say the problem is your host sucks. BAD.

    Though this:
    http://www.sparkjunction.com/welcome-grn-img-001.png

    At 99k it's a little ridiculous... as is the IFRAME in the footer (a whole separate document/handshake to be loaded!)

    But since that site uses the non-real world deployable non-recommendation XHTML 1.1 doctype, I don't think it's the same site you are referring to. Though that too takes WAY too long to load.
     
    deathshadow, Dec 14, 2012 IP
  5. Mr.Dog

    Mr.Dog Active Member

    Messages:
    912
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    60
    #5
    Gosh, there are so many details, I am going to check these out - will take some time to respond to you.

    Deathshadow: I am using the iFrame for the copyright text, as it's sitewide... I know it eats up "some" speed, but I don't think it's all that bad... does that tiny thing down there really weigh so much?

    Oh, no, that's not the site. I'm managing multiple sites; I am working on 10+ to rebuild, repair, update and bring up to today's standards.
    It's taking a lot of time to go through all the sites. Potentially months.

    I am doing my best to implement new things, remove anything that's weighing down.

    But some of those plugins are very important: the commenting systems, the SM bookmarking... Google Analytics couldn't be left out either...

    It's full of dilemmas and headaches!
     
    Mr.Dog, Dec 15, 2012 IP
  6. Mr.Dog

    Mr.Dog Active Member

    Messages:
    912
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    60
    #6
    My character set is:
    <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />

    My Notepad++ is setting the Doctype automatically to:
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">


    Now: all my pages are flat .htm pages with some Javascript (especially for those tools), but otherwise pure .htm pages and code, plus some external CSS stylesheets etc.

    Do you think there's any problem with these? I'm not sure... Honestly, this is the first time I'm bothering with these.
     
    Mr.Dog, Dec 15, 2012 IP
  7. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,999
    Best Answers:
    253
    Trophy Points:
    515
    #7
    They shouldn't be causing problems -- but it all hinges on the rest of the page.

    That said, it's 2013; for the majority of sites there's no excuse for a transitional doctype... Your editor shouldn't be automatically setting a blasted thing.

    If it's English only there's nothing 'wrong' with iso-8859-1 (stick to ASCII7 and it's REALLY a non-issue)... The question is WHERE in the code it is being set. Really, the content-type META should be the first thing after <head>; if it isn't, that's why those 'tools' were kvetching about it.

    On the page you are actually talking about, how many files is it and what's the total filesize compressed and uncompressed? Just how much scripting is there? Is it running any funky script compression that might end up having to run before the images do? How often is onload called? How big is the onload chain? Until you link us to exactly what page is having the problem -- the full HTML, the full CSS, all the scripting and all the images, we can't really tell you a lot more than vague recommendations. There could be a laundry list of problems, there could be one simple bottleneck. We don't know because you're treading into "This is why we can't help you" territory.

    Which is why I took a wild guess and assumed the site in your siggy - then dismissed it, as it's XHTML 1.1 (which shouldn't even be used anywhere).
     
    deathshadow, Dec 15, 2012 IP
  8. Mr.Dog

    Mr.Dog Active Member

    Messages:
    912
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    60
    #8
    Until now I had this before the <head>:
    <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />

    It is English, but it also has content/words in French, Spanish, Eastern European languages etc. So it must cover those characters as well.

    I was thinking there might also be an issue with my host... not sure. There are no other special scripts except Google Analytics, AddThis SM bookmarking (loads 2 JS files) and a small iframe for the copyright text.

    index.htm page file size is 13.2 kb.

    There are 3-4 small images, a low-res background and 1 .jpg image about 400px wide. Most are .gif and optimized down to 16 colors. Even they load like a "curtain" being pulled down.

    I would replace that iframe, but I couldn't find an easier way to have sitewide copyright text that I can set through a single file and propagate across the site.

    Basically the page loads fast, including the CSS buttons, but the background and those few rather obscure images take 2-4 seconds. It never loads in less than 2 seconds.
     
    Mr.Dog, Dec 15, 2012 IP
  9. Rukbat

    Rukbat Well-Known Member

    Messages:
    2,908
    Likes Received:
    37
    Best Answers:
    51
    Trophy Points:
    125
    #9
    Trivial, if you make all your pages PHP files (and change all references to them) and just include the file the text is in (or read it) on every page, at the right of the footer (I think that's the text you're referring to).
     
    Rukbat, Dec 15, 2012 IP
  10. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,999
    Best Answers:
    253
    Trophy Points:
    515
    #10
    So, invalid markup. It doesn't even belong there, and may in fact have been ignored due to that placement.

    In which case you should be looking at UTF-8.

    Entirely possible.

    AddThis is a pig; usually you're better off using simple FB, Twitter and G+ links and saying shtup the rest. Google Analytics is IMHO pointless bloat... anything it tells you that has real meaning could just as easily be gleaned from the server logs using analog or webalizer; anything else is just kvetching over minor data points that have little if anything to do with the actual health of the site.

    ... and again, this is spitting distance from 2013, not 1997, lose the iframe.

    PHP is NOT that hard.

    <?php include('theme/footer.html'); ?>

    Done. You could even include ALL elements that are common across pages in that manner.
    
    <?php
    $pageTitle='home';
    $keywords='keywords,specific,to,this,page';
    include('theme/header.php');
    ?>
    Your page specific code goes here
    <?php
    include('theme/footer.html');
    ?>
    
    Code (markup):
    ... and how big are they? I asked for a size analysis -- like, say, the output from the Web Developer extension for FF, or a waterfall of it.

    Again, without telling us what site it is or having us look at it, we REALLY can't help you.
    You REALLY don't seem to be getting the hint on that...
     
    deathshadow, Dec 15, 2012 IP
  11. Mr.Dog

    Mr.Dog Active Member

    Messages:
    912
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    60
    #11
    Thanks for the helpful tips... gosh, I have to go through a lot of work, but will eventually finish...
     
    Mr.Dog, Jan 11, 2013 IP