I have a .js file that is 132.35 KB in size, but it works pretty quickly (especially in Firefox 3). Is there a size limit where things start to slow down, or is it dependent on what the JavaScript is doing? I've seen examples of over 500 KB... just wondering! ED
The biggest issue will probably be the time it takes to download it. Any issues the size causes in the browser once it's downloaded will depend on what the code is actually doing.
It's a "bad word filter" for a form and it has alot of bad words and phrases in the "var bad_words_arr" section.....7000+ Ed
Why use AJAX for this, then? Or, better still, allow the user to enter whatever rubbish they want, and filter the message while inserting it into the DB!
Yeah, I'm not sold on doing that client-side either if it's that many... Especially since you'd need a server-side fallback anyway for users who turn off JavaScript, right? Between the extra handshake for the .js, reading it from the server, and the data-transfer time, I'm willing to bet that in most cases it would take longer than just executing it server-side in the first place. Especially since, with that many words, false positives would be hell... Nothing like protecting the kids by having to explain why "Magna Cum Laude" is censored, or what's wrong with the names Dick Van Dyke or Dick Van Patten.
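To illustrate the false-positive problem: a minimal sketch, assuming the filter does plain substring matching against the word list (the function names and the two-entry stand-in list below are made up, not from Ed's actual filter.js). Whole-word matching with a regex removes one whole class of false positives (words hiding inside longer words), though standalone names like "Dick Van Dyke" would still need a whitelist or human review.

```javascript
// Tiny stand-in for the real 7000+ entry bad_words_arr.
var bad_words_arr = ["cum", "dick"];

// Naive substring matching: flags innocent words like "circumstances".
function containsBadWord(text) {
  var lower = text.toLowerCase();
  for (var i = 0; i < bad_words_arr.length; i++) {
    if (lower.indexOf(bad_words_arr[i]) !== -1) return true;
  }
  return false;
}

// Whole-word matching: \b anchors the pattern to word boundaries,
// so words embedded inside longer words no longer trigger the filter.
function containsBadWordStrict(text) {
  for (var i = 0; i < bad_words_arr.length; i++) {
    var re = new RegExp("\\b" + bad_words_arr[i] + "\\b", "i");
    if (re.test(text)) return true;
  }
  return false;
}
```

Note that the strict version still flags "Dick Van Dyke", because there "Dick" really is a standalone word; no amount of boundary-matching fixes that class of false positive.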
False positives have been a big issue for me with this project; there are just too many words with multiple meanings. Even Google SafeSearch filters out the word "chicks". Better to be SAFE than sorry! Here's a previous thread where you can test it. Ed
Have you tried gzipping that 135 KB? It might help cut down on load time. My CMS uses more than 150 KB of JS, but it's all loaded in stages; the first stage won't be more than 100 KB or so.
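The staged loading mentioned above can be sketched client-side by injecting script tags on demand; this is an assumed mechanism, not the CMS's actual code, and loadScript() is a hypothetical helper (the "filter.js" name is from this thread):

```javascript
// Inject a script tag so a heavy file loads as a later stage,
// after the core page is already usable.
function loadScript(src, onLoad) {
  var s = document.createElement("script");
  s.type = "text/javascript";
  s.src = src;
  if (onLoad) {
    s.onload = onLoad; // fires in Firefox et al.; very old IE would need onreadystatechange instead
  }
  document.getElementsByTagName("head")[0].appendChild(s);
}

// Only wire up the page hook when running in a browser:
if (typeof window !== "undefined") {
  window.onload = function () {
    // Stage 2: pull in the big word list once the page itself is ready.
    loadScript("filter.js", function () {
      // bad_words_arr is now available; enable the filter here
    });
  };
}
```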
Once I gzip it, is there anything more I need to do other than <script language="javascript" src="filter.js.gz"></script>? I couldn't get it to work like this.
I don't know what you created filter.js.gz with... For gzipping to work, you need to tell PHP to send gzip headers (IF the browser supports it) and then gzip the content in PHP:

$e = $_SERVER["HTTP_ACCEPT_ENCODING"];
if (headers_sent()) {
    $encoding = false;
} elseif (strpos($e, 'x-gzip') !== false) {
    $encoding = 'x-gzip';
} elseif (strpos($e, 'gzip') !== false) {
    $encoding = 'gzip';
} else {
    $encoding = false;
}
if ($encoding) {
    header('Content-Encoding: ' . $encoding);
    // gzencode() produces a complete gzip stream (header, deflated data, CRC32 trailer).
    // gzcompress() with a hand-made header won't do: it emits zlib data, which
    // browsers will reject as a broken gzip stream.
    echo gzencode($content, 9);
    die(); // you can't output plain text any more; that would corrupt the gzipped data stream
}
Code (markup):
If you've got a lot of JavaScript to load, it makes sense to do that in stages, IMO. The foundation for staged loading is a PHP script that takes (JS) library IDs, maps those to relative paths within your software's distribution, concatenates the JavaScript for several lib IDs together, optionally minifies/obfuscates and gzips them, and optionally (when not in development_mode) stores them (plaintext and gzip-compressed content) in a cache directory for later fast access.
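The client side of such a combining script only needs to build one URL from the list of library IDs, so several files arrive in a single gzipped response. A minimal sketch, assuming a hypothetical server script name ("libs.php") and query-parameter format; neither comes from the CMS described above:

```javascript
// Build the URL for a server-side combiner that takes a
// comma-separated list of library IDs, e.g.
//   libs.php?libs=core,validation,badwords
function libUrl(baseUrl, libIds) {
  return baseUrl + "?libs=" + libIds.join(",");
}
```

The resulting URL would then be injected via a dynamically created script tag, so all stages go through one cacheable, gzipped request instead of a handshake per file.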