Which online tools are you using?

Discussion in 'HTML & Website Design' started by Jacotus Brededin, Jan 8, 2009.

  1. #1
    Hi Guys,

    Which online tools are you using when you're working on a site?
    I use about 30 different tools: .htpasswd generators, a CSS minifier, a JavaScript minifier, etc...
    What about you? Any good tools you would like to share?

    Thanks,
    JB
     
    Jacotus Brededin, Jan 8, 2009 IP
  2. crath

    crath Well-Known Member

    Messages:
    661
    Likes Received:
    33
    Best Answers:
    0
    Trophy Points:
    100
    #2
    Every time I find myself using a tool, I learn how to do whatever it is the tool is doing for me.

    You will find yourself becoming very efficient that way.
     
    crath, Jan 8, 2009 IP
  3. cupidsorchard

    cupidsorchard Peon

    Messages:
    24
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    It's not an online tool, but chami.com has the best-kept secret of a free HTML editor, something I really could not have done without.
     
    cupidsorchard, Jan 8, 2009 IP
  4. drhowarddrfine

    drhowarddrfine Peon

    Messages:
    5,428
    Likes Received:
    95
    Best Answers:
    7
    Trophy Points:
    0
    #4
    Exactly.

    30 tools? Sheesh.
     
    drhowarddrfine, Jan 8, 2009 IP
  5. brutskiy

    brutskiy Peon

    Messages:
    6
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Use your favorite HTML editor, FileZilla, and the Webmaster Tools plugin for Firefox. Also check out HotScripts.com and DynamicDrive.com (tools.dynamicdrive.com).
     
    brutskiy, Jan 8, 2009 IP
  6. Stomme poes

    Stomme poes Peon

    Messages:
    3,195
    Likes Received:
    136
    Best Answers:
    0
    Trophy Points:
    0
    #6
    Online tools I use:

    -the W3C's HTML and CSS validators (yeah, I know FF and some others have their own, but I've also heard of those missing stuff, and I always have an Internet connection)

    -I test contrast with GrayBit
    -I test colours for contrast and discernibility for colourblindness at Vischeck (knowing both of these are beta and not totally complete or bug-free yet)
    -I use that Polish guy's em calculator when I'm given a px size and want a better starting point when trying out em sizes. You can NOT truly convert from px to em, since an em depends on the inherited font size, but this site uses JS to (I think) read your resolution and default font size and gives you the best ballpark you can get. Meaning your testing in all browsers that the size is right goes faster.
    -I'm too lazy to crawl through the specs when I just want a quick lookup, so I check Florida State University's tag reference, even though it is XHTML 1.1 (their program is heavily misguided there). So far I haven't found any differences between their XHTML 1.1 and the HTML 4.01 I write in.
    -I use Wikipedia's Character Entity Reference page for all my unicode/ascii needs. Since some of my pages are (pretend) XHTML, I might as well follow the rule that XML only recognises five "named" character entities (one of which HTML doesn't recognise), with the rest in some ascii/numeric form. So I just do that by default for all character entities except the four that HTML and XML share.
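    In rough terms, what such an em calculator does can be sketched like this (a ballpark sketch only; 16px is merely the usual browser default, and the real base is whatever the user or a parent element has set, which is exactly why an exact px-to-em conversion is impossible):

    ```python
    def px_to_em(px, base_px=16.0):
        """Ballpark px -> em conversion.

        base_px=16.0 is only the common browser default; the true base is
        the inherited font size, so this gives a starting point to refine
        by testing in real browsers, not an exact answer.
        """
        return px / base_px

    print(px_to_em(12))   # 0.75
    print(px_to_em(24))   # 1.5
    ```
    
    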

    Everything else is a helpful bookmark for me, reminding me of stuff.
     
    Stomme poes, Jan 9, 2009 IP
  7. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #7
    Would you do JavaScript minification by hand? That doesn't make much sense to me...
     
    Jacotus Brededin, Jan 9, 2009 IP
  8. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #8
    Yep, about 30 tools.

    I could use offline versions, but then you have to update your tools regularly, which is quite a pain...
     
    Jacotus Brededin, Jan 9, 2009 IP
  9. CerIs

    CerIs Active Member

    Messages:
    69
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    58
    #9
    validator.w3.org from the W3C is essential for valid XHTML
     
    CerIs, Jan 9, 2009 IP
  10. sampathsl

    sampathsl Guest

    Messages:
    861
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    0
    #10
    I can do most of my work with Firefox plugins and HTML editors such as Dreamweaver.
     
    sampathsl, Jan 9, 2009 IP
  11. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,999
    Best Answers:
    253
    Trophy Points:
    515
    #11
    As Crath and DrHowardDrFine noted, over-reliance on tools generally reduces productivity compared to learning how to really do it... A LOT of the tools listed so far are just a bad idea. If you NEED something like CSS compressors or javascript compressors, then A> your server isn't well optimized (go inline compression), and B> congratulations, you just made MORE work for yourself should you need to debug, since you are now maintaining multiple copies of the same file.

    Others are quite interesting - like GrayBit... especially since it appears to use that same color conversion formula I'm always talking about (though they forgot to perform level adjustment). Since I know the formula, though, I don't need it; I automatically use colors of the proper contrast.

    As to online tools I use? It's a short list:

    http://validator.w3.org/
    http://jigsaw.w3.org/css-validator/
    http://validator.w3.org/feed/
    http://www.cynthiasays.com (for Section 508 and WCAG checks)

    and that's about it.
     
    deathshadow, Jan 9, 2009 IP
  12. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #12
    Again, I disagree. If you have a lot of traffic and want your website to be as responsive as possible, you have to minify CSS and JavaScript. Have you ever tried YSlow?

    There's an excellent article in "Communications of the ACM" on optimising websites for speed. You can see a bit of the article here if you're not a subscriber.
     
    Jacotus Brededin, Jan 9, 2009 IP
  13. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #13
    I couldn't live without my Firefox plugins: Firebug, YSlow, SearchStatus, etc...
     
    Jacotus Brededin, Jan 9, 2009 IP
  14. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #14
    There's a video on Steve Souders' website of his Stanford class "High Performance Web Sites", which his article for CACM is based on. For those who don't know who Steve Souders is, he's the guy who was in charge of performance at Yahoo! and is now doing the same kind of thing for Google.
     
    Jacotus Brededin, Jan 9, 2009 IP
  15. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,999
    Best Answers:
    253
    Trophy Points:
    515
    #15
    If you have enough CSS and javascript to need minification, or enough that inline compression will actually generate REAL CPU load (I usually max out the network pipe LONG before the CPU - what's chewing your time? 10 to 1 it's IOWAIT), you probably have too much javascript and CSS. Though on that page you linked to, it mentions gzipped compression as item #4 - exactly what I was talking about instead of wasting time on your 'minify' type nonsense... which makes it odd that he then talks about minify-type stuff, since it's REDUNDANT. White-space stripping and code simplification/obfuscation will NOT make much impact on the final gzipped size - the only place you might see a boost is in the initial disk access to the file, and frankly, if the difference ends up being 20 bytes, MAYBE? That's probably not even breaking a block boundary.
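    The "minify is redundant once you gzip" claim is easy to sanity-check with a quick sketch (the regex below is a crude whitespace-stripper standing in for a real minifier, and the sample script is invented for illustration):

    ```python
    import gzip
    import re

    # A repetitive chunk of sample javascript, repeated to give gzip
    # something realistic to work with.
    js = b"""
    function addClass(el, cls) {
        if (!el.className.match(cls)) {
            el.className = el.className + ' ' + cls;
        }
    }
    """ * 20

    # Crude stand-in for a minifier: collapse all whitespace runs.
    minified = re.sub(rb'\s+', b' ', js)

    # The raw sizes differ a lot; the gzipped sizes differ far less,
    # because gzip already squeezes out the redundancy whitespace adds.
    print(len(js), len(minified))
    print(len(gzip.compress(js)), len(gzip.compress(minified)))
    ```

    The gap between the two gzipped sizes comes out much smaller than the gap between the raw sizes, which is the point being argued above.
    
    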

    But then I consider the upper limit for javascript on a page to be 10k uncompressed unless I'm making a full-blown web application... and I have much the same opinion of CSS. I see pages using 60-256k of javascript all the time, and I look at the pages of the site going "and they're using javascript FOR WHAT?!?"

    Usually it's fat bloated rubbish frameworks for 'gee ain't it neat' bullshit like jQuery or MooTools being milked for complete accessibility /FAIL/ nonsense. See the new Hotmail (as described in my 'rubbish code' blog entry), which is buggy, slow, bloated, a complete accessibility /FAIL/ from top to bottom, and can't even be ****ing navigated like a NORMAL WEBSITE (too much AJAX breaks forward/back) - much less the rubbish broken WYSIWYG bullshit editor that doesn't even let you enter plain text properly (even when you select plain text).

    Oh, and YSlow is not an online tool, it's an extension... which frankly does nothing for me, since I don't need Firebug on my own code (I only really use it when working on complete crap written by other people) apart from the stats view - and frankly I get a more useful version of that from the Web Developer Toolbar's "Information > Document Size".

    Of course, if you want REAL bandwidth savings without turning your code into incomprehensible gibberish, try image recombination techniques and file count reduction - which can often reduce server load AND speed page load times even when you don't change the total filesizes.
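    A minimal sketch of the file-count-reduction idea: serve one combined stylesheet instead of several, so the page costs one HTTP request rather than one per sheet. (All names here are hypothetical; a real build step would also have to watch @import order and relative url() paths.)

    ```python
    from pathlib import Path

    def combine_css(paths, out_path):
        """Concatenate several stylesheets into a single file.

        One combined file means one HTTP request instead of one per
        stylesheet - the 'file count reduction' being recommended above.
        Sketch only: order matters in CSS, so callers must pass paths
        in the order the sheets were originally loaded.
        """
        combined = "\n".join(Path(p).read_text() for p in paths)
        Path(out_path).write_text(combined)
        return out_path
    ```
    
    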
     
    deathshadow, Jan 9, 2009 IP
  16. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #16
    I'm not advocating minification instead of compression, image recombination and file count reduction. You need to do everything.

    I understand your point of view, but sometimes you have to deal with frameworks. Did you know that Facebook uses 1088k of JavaScript? Even Google search uses 15k of JavaScript.

    Anyway, I don't want to start a religious war. Typically, that's the kind of online tool that I use. I also use HTML/CSS/XHTML validators, favicon converters, readability tests, colorblindness tests, etc...

    I know that I could use offline versions of most of those tools, but I'm a bit lazy on that side and I work on a laptop half of the time.

    Also, I forgot to mention that I work on a Mac, so launching Windows utilities is a bit of a pain, as I have to start the Windows virtual machine (which is actually still faster than starting up my PC).
     
    Jacotus Brededin, Jan 9, 2009 IP
  17. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,999
    Best Answers:
    253
    Trophy Points:
    515
    #17
    Whereas I consider frameworks to be a blight upon the internet, anathema to semantic markup, and an incredibly STUPID way of doing things in what is effectively an INTERPRETED language / set of languages.

    I believe you mean 306k served as 96k compressed by gzip, which is the comedy, since they white-space strip some of it and line-strip some of it, when frankly I'd be surprised to see that save them more than 150 bytes per VISITOR. Wasted effort just to make the code impossible to read, meaning they end up having to maintain two different versions - development and deployment. ALWAYS fun when trying to debug a problem the live code is giving you when the line numbers don't even line up.

    Scripts: 1 file, 2kb (6kb uncompressed). You MIGHT be able to argue an additional 1.5k on their index page from the inlined scripting - though frankly they'd see bigger savings if they weren't INLINING THEIR SCRIPTS AND CSS. The search results page doesn't even have their damned scripts as separate files, so I have no clue where you got that 15k number - unless 15k of that 23k uncompressed search results page is javascript, in which case that's made of even bigger /FAIL/. They would save more bandwidth by moving those inlined elements out of the markup than any of that white-space stripping is saving them, and probably make maintenance and development easier to boot!
     
    deathshadow, Jan 9, 2009 IP
  18. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #18
    Point taken, but for me it's about the user experience more than saving bandwidth. If my page loads 0.2 seconds faster, I'll go through the inconvenience, as my users always come first.

    For Google, I'm sure that 150 bytes per request is significant. How many queries do they get? 10 billion a day? That's about 10 gigs of bandwidth per byte saved. So your 150 bytes ends up being 1.5TB of bandwidth a day...
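    The arithmetic above checks out as a back-of-the-envelope calculation (the 10-billion-queries-a-day figure is the poster's guess, not a published Google number):

    ```python
    queries_per_day = 10_000_000_000   # assumed query volume from the post above
    bytes_saved = 150                  # per-response saving being debated

    # 1 byte saved per response -> 10 GB/day; 150 bytes -> 1.5 TB/day
    total_bytes = queries_per_day * bytes_saved
    print(total_bytes / 1e12)          # prints 1.5 (terabytes per day)
    ```
    
    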
     
    Jacotus Brededin, Jan 9, 2009 IP
  19. Jacotus Brededin

    Jacotus Brededin Well-Known Member

    Messages:
    361
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    110
    #19
    validator.w3.org has to be number one for everybody. I wonder how much weight Google puts on valid XHTML in calculating PR.
     
    Jacotus Brededin, Jan 9, 2009 IP
  20. greenexit

    greenexit Peon

    Messages:
    37
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #20
    I like using the Lorem Ipsum generator sometimes.
     
    greenexit, Jan 10, 2009 IP