List for Faster Web Page Methods

Make fewer HTTP requests - reducing 304s with Cache-Control headers
Use a CDN
Add an Expires header - caching with mod_expires on Apache (see the .htaccess sketch at the end of this post)
Gzip components
Put CSS at the top
Move JS to the bottom
Avoid CSS expressions
Make JS and CSS external
Reduce DNS lookups - use a static IP address; use a subdomain for static content
Minify JS - refactor the code, compress with Dojo
Avoid redirects - use internal redirection with mod_rewrite; redirect the correct way with a 301
Remove duplicate scripts
Turn off ETags - in .htaccess:

    FileETag None
    Header unset ETag

Make AJAX cacheable and small

Sources & References

Book: These rules are the key to speeding up your web pages
Image maps
How do CSS Sprites work
Speedy Sites with Image Sprites and CSS
Embed (small) media type data directly inline
JSMin is a filter which removes comments and unnecessary whitespace from JavaScript files
A tool to reduce the size, and therefore latency, of JavaScript downloaded by browsers
Status Codes for HTTP
All 57 HTTP Status Codes
A graphical tool that enables web content providers to rapidly and accurately measure client-side performance of web pages
Speed up Firefox
HTTP Debugging Tool - debug the web!
Part 1: What the 80/20 Rule Tells Us about Reducing HTTP Requests
Part 2: Browser Cache Usage - Exposed!
Part 3: When the Cookie Crumbles
Part 4: Maximizing Parallel Downloads in the Carpool Lane
The Importance of Front-End Performance
Rule 1 - Make Fewer HTTP Requests
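To illustrate the Expires-header item above, here's a minimal .htaccess sketch using mod_expires (assuming the module is enabled; the lifetimes are my own illustrative choices, not prescriptions from the list):

    <IfModule mod_expires.c>
        ExpiresActive On
        # Far-future expiry for static assets that rarely change
        ExpiresByType image/png "access plus 1 year"
        ExpiresByType text/css "access plus 1 month"
        ExpiresByType application/javascript "access plus 1 month"
        # Keep HTML fresh so content updates show up right away
        ExpiresByType text/html "access plus 0 seconds"
    </IfModule>

Far-future expiry on static assets lets browsers skip the conditional GETs that produce all those 304s in the first place.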
I bet a $500 BSD/Linux server configured with the above tips would be noticeably faster than a site on an unoptimized $20,000 non-Apache server. Add the hosting, DNS, static IPs, bandwidth, etc., and you could do it for around a cool $300/yr. $800 vs. $20,000, plus who knows how much money companies and corporations waste every year on packaged technology "solutions" instead of know-how and best practices. It's still tough to learn everything mentioned in the list, but it's worth it when your sites are noticeably, almost instantaneously quick.
Could you clarify this more? Do you mean mod_gzip? Because IMHO this is for reducing bandwidth usage, not for speeding things up. In certain scenarios (lots of highly compressible data) I guess it would speed things up, but usually it slows things down because of the overhead involved: your server has to gzip the page before sending it, and then the user's machine has to decompress it. Granted, the CPU load for the user will be unnoticeable, but not so on the server if you're serving lots of requests. Your other tips seem quite useful, though, so maybe I'm mistaken on the above; I'd like to hear your input on it.
gzip lowers bandwidth usage: it compresses the transfers, so less bandwidth is used on the owner's server. And since fewer bytes go over the wire, the page downloads faster, so the page automatically speeds up!
Right, but you're forgetting the performance overhead from actually compressing the page before transfer. Compressing a page takes CPU time on both the client and the server. It's negligible on the client side, but on the server side, where hundreds of pages are being compressed every second, it could really slow things down, AFAIK.
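For what it's worth, the overhead is tunable. Here's a minimal httpd.conf sketch using Apache's mod_deflate (assuming the module is loaded; the MIME types and compression level are illustrative choices, not from this thread) that compresses only text responses and keeps the CPU cost low:

    <IfModule mod_deflate.c>
        # Compress only text responses; images and archives are already compressed
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
        # 1 = fastest/least compression, 9 = slowest/most; stay cheap under heavy load
        DeflateCompressionLevel 3
    </IfModule>

Skipping already-compressed formats avoids burning CPU for near-zero bandwidth savings.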
Fix it by using Linux. You may have been looking for a more serious answer, but hey, it's still a perfectly valid solution.
Gzip is done on the server side, so it will actually increase server load, but it will decrease the time it takes your users to download the page and save you some bandwidth. If your forum is slow, you need to look into PHP caching methods; APC is great (see the sketch below): http://forums.digitalpoint.com/showthread.php?t=352387. You could also look at configuring Apache or PHP, or at removing phpBB plugins that may be affecting your server load.
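To make the APC suggestion concrete, here is a minimal PHP sketch of caching an expensive page fragment with apc_fetch()/apc_store(); the key name, the 5-minute TTL, and the build_front_page() helper are hypothetical, not from the linked thread:

    <?php
    // Try the shared-memory cache first; $hit reports whether the key was found
    $key  = 'forum_front_page';        // hypothetical cache key
    $html = apc_fetch($key, $hit);
    if (!$hit) {
        $html = build_front_page();    // hypothetical expensive render (DB queries, templates)
        apc_store($key, $html, 300);   // keep the result for 5 minutes
    }
    echo $html;

The win is that repeat requests skip the database entirely, which is usually where a slow forum spends its time.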
Well, for me, my customers and users are far more important than CPU usage, so a slightly higher load won't be a problem. Also, on a single processor you shouldn't host more than about 150 sites; on a quad core, 600 is the max. Estimate the CPU load accordingly. My servers run at a load of 0.05-0.75 out of a max of 2; the other one, with gzip enabled and the same number of sites hosted, peaks at a load of 1.25 out of a max of 4.
No problem... these make such a huge, noticeable difference. I wish the whole net were as fast as my site.
I don't agree. Before quad cores and even HT there were plain old Xeons and Pentiums, and plenty of hosts handled hundreds of sites on a single CPU. I personally have a now-ancient P4 3.2 GHz HT BSD server that's been running for five years. I practice good security and maintain my server with updates. After 5+ years it dishes out 500+ GB of bandwidth per month and hosts about 50 of my most active sites with no problems. Load is normally under 0.3. Most people overdo it on the processor and don't spend enough on optimizing the server. I would like to add one extra tip: make sure to run a PHP accelerator like eAccelerator (sample config below). It can certainly speed up the site and do wonders with server load.
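For reference, a minimal php.ini sketch for enabling eAccelerator; the extension path, shared-memory size, and cache directory vary by install and are illustrative here:

    extension="eaccelerator.so"        ; loading method/path depends on your PHP build
    eaccelerator.enable="1"
    eaccelerator.optimizer="1"
    eaccelerator.shm_size="16"         ; MB of shared memory for compiled scripts
    eaccelerator.cache_dir="/tmp/eaccelerator"

Caching compiled opcodes means PHP stops re-parsing every script on every request, which is where much of the per-request CPU goes.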
I agree, Rect. At first I was all about minimizing traffic, but at this point I usually opt to disable gzip compression server-side and to pass on dynamically generated or database-driven content in favor of static content when possible. In five years or so bandwidth won't be an issue, and processing speeds are also doing what they do. Still, the experience from speeding up your sites and servers is invaluable.