Hello, I am updating a website, and it seems it was made with Less CSS. I don't know if that has anything to do with it (I don't use Less), but when I open the CSS file in Sublime Text the code is displayed on a single line, which makes editing a bit difficult. Is there a way to display this code vertically? Thanks
It's probably been "minified". Just paste your code into this site and it will output it "unminified": http://unminify.com On a side note, if the CSS file is so bloated that it needed to be minified, chances are there's something wrong with it.
Something I've been saying for over a decade now, and the same goes for the markup. Minification is typically a desperation move by people not qualified to be writing HTML, CSS or JavaScript, to try and make their fat bloated disasters load 1/100th of a second faster -- because 1/100th of a second makes such a huge difference in a 30-second-PLUS download time. Almost always when minification (also known as whitespace stripping) is used, it's because the developer was using hundreds of K to do dozens of K's job: 500k of JS to do 18k's job, 256k of CSS to do 16k's job, 96k of HTML to do 16k's job... I'd laugh if it weren't so pathetic... oh what the hell, I'll laugh anyway.

Also beware that just because it validates doesn't necessarily mean it's right -- validation doesn't check heading order or document structure. It's a good first step, but...

A GREAT way to tell if a page is rubbish? Open it in a browser, hit ctrl-A to select all, ctrl-C to copy, then paste it into a flat text editor like Sublime. How much CONTENT is there? Manually count how many CONTENT images, objects or videos are present (that's content, NOT presentation like theme/template images, as those have NO business in the HTML), and then plug them into this formula:

3k + (text size * 1.5) + (images/objects/videos * 256 bytes)

If the markup (HTML file) is larger than what that formula tells you, the page is probably poorly coded rubbish. For example, if there's 4k of plaintext and 8 content images, that formula says the HTML should be 11k. That's the generous version of said math; the ideal is:

1.5k + (text size * 1.5) + (images * 192 bytes)

Go much past those numbers and it's just developer ineptitude in action. In the same way, for MOST websites, regardless of how fancy they are, there's usually little reason for there to be more than 48k of CSS per file per media target FOR THE ENTIRE SITE. Most sites realistically should have 24k or less of CSS, which is why "frameworks" like bootcrap are just halfwit nonsense... 
since using hundreds of K of CSS in the framework means writing twice or more the HTML AND even more CSS, then adding a reliance on scripting on top of it... and this is somehow magically EASIER? Severe Montoyaism right there.

You should also load up the Web Developer toolbar for FF (a must-have tool) to check under Information -> Document Outline, as that will tell you whether the headings (if any) make any sense. Also check whether the page is still usable if you disable images or block CSS... a well-written page should "gracefully degrade" with those things disabled.

...and pretty much, if they resorted to minification, you can be relatively certain it's a fat bloated pig built on ignorance.

ON TOPIC -- for getting rid of minification, one GREAT tool is built right into Firefox: the new document inspector (which makes Firebug obsolete). If you pull it up (right-click and choose "Inspect Element", or hit F12) and go into "Style Editor", it will show you the CSS unminified -- with a vertical tab-list of the files included on the page. Inside each of the tree-style tabs is a save option that saves the file unminified instead of in its original format. Even cooler, you can edit it right there, so if it's just minor changes you can test them on the spot. Just load the page, go into Style Editor, choose the file you want, and you can save it with the tabs and carriage returns all nicely formatted for you.

NORMALLY I'm against live editing and prefer to have the code in an editor separate from the browser, but oftentimes, with websites that are such inaccessible messes that I need to create a USER.CSS just to TRY to put up with them, it's become a very handy tool for determining what values I need to change to drag websites (including these forums) kicking and screaming into the light.
Regardless of whether the site was made with Less or Sass or something similar, that has nothing to do with the code being minified (not automatically, at least) -- most likely someone minified the output after they finished the site. It's usually not needed, but it doesn't hurt either, so I see nothing wrong with minifying the included files -- as long as you keep the original code files on your computer, available for changes and edits.
Take care, though. That process adds another point of failure due to inadvertent(?) forking. The maintainer may not have the original source, and may not have access (working offline, e.g.) to a de-minifying service. It also adds a layer of expense to maintenance and debugging; on an active business site, annual maintenance costs can easily run 1.5 to 2 times the original development cost without adding unneeded speed bumps. cheers, gary
That's really most of the problem with it right there -- if the code is written worth a damn, the savings usually aren't enough to risk pissing off the client; particularly if they aren't provided the original working sources, which shouldn't happen but sadly is all too often the case. It just makes the next poor slob trying to work with it have to work harder.

I mean, a PROPERLY written website with, say, 12k of markup might see MAYBE 2k of savings, while around 14k of CSS shouldn't even see 1.5k of reduction... is it REALLY worth 3.5k per page -- not even meeting the packet round-ups for most files -- just to make it harder to maintain? SURE, if you have a fat, bloated, poorly written 100k train wreck you might see as much as 60k of savings, but that's more an indication of bad development practices than of the usefulness of said tools. The better written the code is, the less useful minification/whitespace stripping becomes. You have to ask whether that benefit of a few k of bandwidth is actually worth making the site harder to maintain.

Which of course is exactly the problem @bussw83 is now butting heads with: working on a site someone else built, that's been crapped on with minification, without the original source materials it was built from.

Though at least this is CSS we're talking about -- because spaces between elements can change the rendering of HTML, on some (poorly coded) layouts you run the risk of undoing minification actually breaking the layout. That can be a REAL headache. It's slightly related to other headaches like poor comment placement tripping rendering bugs, script tag placement being treated as a block-level break in some builds of FF, and a host of other things that really SHOULDN'T happen with HTML but do, thanks to crappy browser implementations. See how:

<li>Test</li><li>Test</li>

can actually render differently from:

<li>Test</li>
<li>Test</li>

depending on what you throw at it for CSS. 
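A minimal sketch of how that can happen (the class name is hypothetical): with inline-block list items, the whitespace between one closing tag and the next opening tag renders as a word-space, so stripping or restoring it visibly changes the layout:

```html
<style>
  /* hypothetical menu styling -- each item becomes an inline-block box */
  .nav li { display: inline-block; padding: 0.25em 0.5em; background: #ddd; }
</style>

<!-- minified: no whitespace between the items, so the boxes touch -->
<ul class="nav"><li>Test</li><li>Test</li></ul>

<!-- unminified: the newline between the items renders as a visible gap -->
<ul class="nav">
  <li>Test</li>
  <li>Test</li>
</ul>
```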
The first time I came across that, some ten to fifteen years ago, I was banging my head against the wall for a week.
In an ideal world that would be great, but sadly many customers come with websites made by someone else... Thankfully the page is still editable.
Do you think that's the reason they do it -- to have the customer coming back to them to fix those bugs?
BINGO! That's something I've suspected for a LONG time about a number of practices I see people doing out there. The all-time classic of "vendor lock-in": make it so complex, so hard to maintain, and so hard to move that they have no choice but to come back to you "hat in hand".
I'm not sure that's the case. I don't believe the vast majority of "developers" who use such a method are smart/aware enough for it to be purposeful. They've simply bought into the unicorn farts without considering the consequences. cheers, gary
If that were true, Apple would have gone under decades ago. Vendor lock-in is a VERY common practice. Sometimes companies even claim they are fighting vendor lock-in while creating it.

You can even see an example of this in HTML 5 with the VIDEO tag... it leaves us at the whims and mercies of whatever the browser makers FEEL like implementing, so if something new and better comes along and they won't implement it, OH WELL. The exact OPPOSITE of fighting vendor lock-in. They call it fighting vendor lock-in because of "the all-encompassing evil known as Flash" and its having almost reached monopoly status -- which it did through a universal interface (the OBJECT tag) created to allow everyone to compete on an open footing. That open environment killed off the media format wars of Windows Media vs. QuickTime vs. RealPlayer and crowned a new king.

Suddenly we're back to the WORST of the format wars, needing no less than three (ideally five) codec/container combinations to deploy the same video content, and people are dumb enough to yum it up like the best thing since sliced bread? Colour me unimpressed. Vendor lock-in by the folks who LOST the format wars over a decade ago, in the name of fighting the "vendor lock-in" of a format that came to be through an interface designed to fight vendor lock-in. If you can make sense of that, or of how it's any form of 'improvement', you're doing better than me.
Well, that's true of a LOT of what people do in building websites. HTML, CSS and JS "frameworks" all immediately come to mind, as do things like "grids", fixed-width layouts, fixed-height backgrounds, declaring fonts in pixel sizes, and the whole gamut of other ignorant practices that are so commonplace.
Not sure what static pages have to do with any of that... typically when people say "static pages" they mean flat HTML with no CMS behind it, which can be done either way. ...and if you mean a static layout, that's what I was raging against.
Exactly. I call pages with a fixed layout "static websites", but that's just what I call them personally, since they keep the same size. I see the layout as the webpage.