I created a new site, http://sampathdesign.com/ - could you tell me your thoughts?

Discussion in 'HTML & Website Design' started by sampath1, Mar 24, 2015.

  1. #1
    Hi,

    I have designed a new site to sell small-business sites and graphics. Could you tell me your thoughts, suggested improvements, etc.?

    Site URL: http://sampathdesign.com/

    Thanks
     
    sampath1, Mar 24, 2015 IP
  2. PoPSiCLe

    PoPSiCLe Illustrious Member

    Messages:
    4,623
    Likes Received:
    725
    Best Answers:
    152
    Trophy Points:
    470
    #2
    Seems okay - tested quickly at different resolutions and it seems decent enough - it also works fine without JavaScript enabled (no real problems). It's built on WordPress, so the code isn't super in any way, but that doesn't affect the behavior in any way as far as I can see, nor does it detrimentally affect speed. All in all, not too bad :)
     
    PoPSiCLe, Mar 25, 2015 IP
  3. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,998
    Best Answers:
    253
    Trophy Points:
    515
    #3
    Painfully, agonizingly slow loading here (greater than 50 seconds); illegible fixed-metric fonts send me diving for the zoom, and there are illegible colour contrasts on some elements (those green prices on the textured grey are likely invisible to more than half the population). The massive space-wasting banner just makes it harder to get at what's important, and is a waste of time and bandwidth.

    It's 68 separate files in 2.8 megabytes, which is a significant part of why it's so painfully slow -- almost 900k of that, across 26 of those files, is scripttardery for Christmas only knows what. That typically means either ineptitude on the part of whoever wrote that scripting, or blindly pasting code together in the hope it will work, without caring whether the end result is the least bit accessible or useful to users. The same is shown by the ELEVEN separate CSS files -- with no apparent proper use of media targets, that's the job of ONE stylesheet -- coming to over 200k of style, which for such a simple website is TEN TIMES the amount of code that should have been needed. On handshakes ALONE that file count should take anywhere from a 5 second minimum to a one minute maximum, with 12 seconds being a "real world" average... and there's NO reason for it.

    ... About the only good thing I can say is that it's only 25k of markup, which, while likely TWICE what should have been used, is far, far leaner than I'm used to seeing from off-the-shelf solutions slapped together any old way. The CSS and the scripttardery are the real things pissing all over the page.

    ... and that's before I pop the bonnet and look at the code, which, well... it's turdpress, making it a STUNNING example of why I cannot fathom ANYONE using that steaming pile of manure by choice. It's the typical "I cans haz intarnets" train wreck one can expect from turdpress, with pointless static style in the markup, static scripting in the markup, and the endless, pointless "let's throw classes at EVERYTHING" asshattery that TP developers so love smearing all over the carpets.

    The end result is EXACTLY what I'm always talking about with turdpress and its templates: as a USER of websites, I would be an instant bounce due to its accessibility failings and agonizing load times, which on the whole is why I'd likely toss the entire mess in the trash and start over from scratch. It would likely cost clients visitors and conversions, making it a dead-end money pit in the long term.

    A STUNNING example of everything WRONG with web development today. I pity anyone foolish enough to deploy that on a real website.

    -- edit -- oh, and it appears that none of the code on that page is tripping caching models; the performance analysis in FF is identical between cached and first-load. You really should look into that: it's bad enough that the code base is probably three to ten times larger than it should be and wastes 68 files doing 16 files' job, without it failing to leverage caching properly on top of that.
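    If you want to sanity-check that yourself, here's a rough Node.js sketch that HEADs each asset and reports the headers that govern caching -- the URLs below are placeholders, not the site's actual paths, so fill the list from the page's real network waterfall:
    Code:
    // cache-check.js -- crude sketch: HEAD each asset, print its cache headers.
    var http = require('http');
    var url = require('url');

    var assets = [
      'http://example.com/wp-content/themes/some-theme/style.css', // placeholder
      'http://example.com/wp-includes/js/jquery/jquery.js'         // placeholder
    ];

    assets.forEach(function (asset) {
      var opts = url.parse(asset);
      opts.method = 'HEAD';
      http.request(opts, function (res) {
        console.log(asset);
        console.log('  Cache-Control:', res.headers['cache-control'] || '(none)');
        console.log('  Expires      :', res.headers['expires'] || '(none)');
        console.log('  ETag         :', res.headers['etag'] || '(none)');
      }).end();
    });
    No Cache-Control or Expires on static assets is what makes repeat visits as slow as first loads.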
     
    deathshadow, Mar 25, 2015 IP
  4. PoPSiCLe

    PoPSiCLe Illustrious Member

    Messages:
    4,623
    Likes Received:
    725
    Best Answers:
    152
    Trophy Points:
    470
    #4
    @deathshadow what kind of crappy Internet connection are you on? Seriously? Loading it here (from Norway) takes about 2-3 seconds on my broadband (25/5) - and about the same loading it over 3G on my cell phone. Setting the phone to 2G/EDGE, it still loaded in less than 10 seconds. How in the world of all that isn't holy can it take 50 seconds to load? If you're on an analog modem, I get it, but... seriously?
     
    PoPSiCLe, Mar 25, 2015 IP
  5. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,998
    Best Answers:
    253
    Trophy Points:
    515
    #5
    RIGHT this minute I'm seeing roughly 200 to 240 ms ping times to their server. Since handshakes overlap to an extent, you divide the file count by three, multiply by four, and subtract 8 for the "free" connections (if you aren't connection starved) -- which gives a current first-load estimate, at 11 PM EST from NH, of 20 seconds on a slow test and 16 seconds on a fast test. Earlier the connection was slightly starved, with two people watching Netflix and/or Hulu on the same line, hence it taking three to four times that.
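    Spelled out as code, that back-of-envelope arithmetic looks something like this sketch -- the 4/3 overlap factor and the 8 "free" connections are the rules of thumb described above, not protocol constants:
    Code:
    // Estimate handshake time: divide the file count by three, multiply by
    // four, subtract 8 "free" connections, then multiply by the round-trip ping.
    function estimateHandshakeSeconds(fileCount, pingMs) {
      var effectiveHandshakes = (fileCount / 3) * 4 - 8;
      return Math.max(effectiveHandshakes, 0) * (pingMs / 1000);
    }

    console.log(estimateHandshakeSeconds(68, 200).toFixed(1)); // ~16.5 -- the "fast test"
    console.log(estimateHandshakeSeconds(68, 240).toFixed(1)); // ~19.8 -- the "slow test"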

    Again, that's just the nature of raw connection math. Take your phone down to McDonalds, Panera Bread or some other public location where lots of people are REALLY sharing the connection, and watch 1 SECOND per handshake past the first 8 become the norm. At one second per handshake, there's your minute.

    That's why 68 separate files is, for a LOT of the free world, effectively USELESS -- NOBODY is going to wait for it to finish loading; the laugh being that I'm on what passes for a fast connection here. Most of my neighbors are paying through the nose for either 15mbps down with 768kbps up on cable, or the same nosebleed price for 1.5m/512k on DSL. Go fifty miles north of my location and you're looking at places where it's $100 a month for 1.5/768 satellite, or paying for two phone lines to shotgun dialup.

    NOT everyone in first-world nations has access to the over-the-top, holy-hannah broadband that is commonplace in higher-population-density areas or in places that joined the party late... You might have great ping and access times in Norway, but can the same be said of northern New Hampshire, Vermont, western Maine or almost two-thirds of the non-coastal American Northwest? Much less our friends in Canada or Australia, who are seeing things like connection shaping and bandwidth caps reduce them to two connections at a time and a tenth of the throughput they're paying for?

    That's why you CANNOT trust how fast it is FOR YOU, and why you cannot trust how fast it is for tools like Google PageSpeed... you HAVE to do the math.

    I'm not getting loads as slow as earlier (duh, middle of the night now), but really this:
    [attached screenshot: page-load timing from the browser's network panel]

    is why having five or six dozen separate files on a page -- when from everything I'm seeing there's NO excuse other than "I don't know what I'm doing" for more than 20 files -- is unacceptable and "not viable for web deployment".

    Again, the math: the average expected handshaking time for a 68-file page -- REGARDLESS of connection speed -- is 12 seconds. In practice it can run anywhere from 5 seconds to multiple minutes.

    A concept lost on the "just slap everything together any old way" crowd, and why such "template" nonsense is basically a giant SCAM sleazed together by people who have NO damned business making websites in the first place.

    The scripttardery and CSS ineptitude in particular are why around a third to half of the web is now less useful to me, as a user of websites, than it was a decade ago.

    So yay, you happen to be lucky enough to be sitting on top of their server, and you live in a country with connection speeds that 90%+ of the United States, Canada and Australia would kill for. Good for you.

    Though I would call bullshit on your 2-3 second figure; that would mean a ping time to their server of "you're on the same LAN".

    Allegedly I've got 45/1, and I can reach those 45 speeds down on single files -- but that means JACK ***** when it's 15+ hops with endless routing delays to/from wherever this is hosted. That's WHY YOU KEEP TRACK OF FILE COUNTS and WHY YOU DON'T USE 26 SEPARATE SCRIPT FILES AND 13 SEPARATE CSS FILES when you only have ONE media target!!!

    Stuff like this is why services like OnLive might work in LA, Chicago or New York, but failed to even be playable in 5/6ths of the United States due to input lag.
     
    Last edited: Mar 25, 2015
    deathshadow, Mar 25, 2015 IP
  6. COBOLdinosaur

    COBOLdinosaur Active Member

    Messages:
    515
    Likes Received:
    123
    Best Answers:
    11
    Trophy Points:
    95
    #6
    Yeah, I think just posting a raw download speed is pretty much irrelevant. There are just too many factors involved in how quickly the content gets squeezed through the pipe, regardless of the pipe size. I look at the total load, the total number of HTTP calls necessary, and how many times the parser has to stop and wait for inline scripting stuck in the middle of the markup by an idiot who did a theme without a clue of how to properly generate a page.
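    That last factor is easy to eyeball. A rough console sketch that counts scripts able to block the HTML parser -- external scripts without async/defer block it, and so do inline ones; dynamically injected scripts are async by default, so treat the count as approximate:
    Code:
    // Count parser-blocking <script> tags on the current page.
    var scripts = document.getElementsByTagName('script');
    var blocking = 0;
    for (var i = 0; i < scripts.length; i++) {
      if (!scripts[i].async && !scripts[i].defer) blocking++;
    }
    console.log(blocking + ' of ' + scripts.length + ' scripts are parser-blocking');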

    Bloat is bloat. Crap is crap. Bad practice produces bloated crap that belongs in the garbage can, not on a web site. A load of over 2MB is not a web page, it is a garbage dump, and there is no polite way to describe something that, in addition to everything else, makes the HTML validator puke up 40 errors.
     
    COBOLdinosaur, Mar 26, 2015 IP
  7. minionnz

    minionnz Greenhorn

    Messages:
    17
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    20
    #7
    You can combine and minify those CSS/JS files with https://wordpress.org/plugins/w3-total-cache/
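    (Or, without a plugin, the same idea as a crude Node sketch -- the file names are placeholders for whatever the theme actually loads, and concatenation order matters:)
    Code:
    // combine.js -- naive concatenation of stylesheets into one file.
    var fs = require('fs');
    var files = ['reset.css', 'layout.css', 'theme.css']; // placeholders
    var combined = files.map(function (f) {
      return fs.readFileSync(f, 'utf8');
    }).join('\n');
    fs.writeFileSync('combined.css', combined);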
     
    minionnz, Mar 26, 2015 IP
  8. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,998
    Best Answers:
    253
    Trophy Points:
    515
    #8
    Combining them would be almost a step in the right direction, but in this case it and minification would just be sweeping bad code under the rug like a second-rate Sherri Bobbins... well, actually, that's minification in almost EVERY case: if it pays real benefits, you've probably done something wrong.

    LIKE A ****** MEGABYTE OF JS AND CSS!!!

    207k of CSS and 800k of scripting -- for that... RIGHT. Again, anyone who doesn't know what's wrong with that probably shouldn't be allowed to have a website, much less build them for others.

    JS for nothing and your scripts for free, that ain't workin', that's not how you do it, lemme tell ya, these guys ARE dumb.
     
    deathshadow, Mar 26, 2015 IP
  9. minionnz

    minionnz Greenhorn

    Messages:
    17
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    20
    #9
    Wow. I disagree. If there are real benefits from minification, then you're doing something right. Any JS library should be well written, commented, spread out across multiple files for namespacing/organisation purposes, and written with descriptive variable and function names.

    Let's put things into perspective:

    1 - JS/CSS should be downloaded once, then cached - it doesn't change per request.
    2 - You're displaying/talking about NON-gzipped stats. Considering that gzip can cut JS file size by 70-80% (and it's already enabled on the server), it's probably closer to 200-300kb over the wire (the console sketch after this list shows how to check actual transfer sizes).
    3 - Half of the JS used is third-party scripts hosted on other domains, so it won't be affected by per-host browser download limits and should be downloaded in parallel.
    4 - Facebook - the 2nd (possibly 1st) most popular site in the world - is over 6MB, around 3MB of which is JS alone. People are used to downloading this amount of data.
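    Here's the sketch for checking point 2 -- it assumes a browser that implements Resource Timing Level 2 (transferSize/decodedBodySize aren't available everywhere, and cached entries may report a transferSize of 0):
    Code:
    // Compare over-the-wire (compressed) size to decoded size per resource.
    performance.getEntriesByType('resource').forEach(function (r) {
      if (r.transferSize > 0 && r.decodedBodySize > 0) {
        var saved = Math.round(100 - 100 * r.transferSize / r.decodedBodySize);
        console.log(r.name + ': ' + r.transferSize + ' B on the wire, ' +
                    r.decodedBodySize + ' B decoded (' + saved + '% smaller)');
      }
    });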

    The web today is a lot more dynamic than it used to be. Web page/JS sizes have more than doubled since 2010. Software is also a lot larger. I expect the same to happen again over the next 5-10 years.
    The problem is being exaggerated here, and I'm not sure why. I see absolutely nothing wrong with the size of the site. It can be trimmed, it can be minified, much of the CSS could be removed, some of the JS might be unnecessary - but does that really mean the owners/authors and developers deserve to be insulted??

    Most of the issues you've described can be solved using common, well-known and respected web deployment practices such as minification, cross-domain hosting, gzip, etc. - something even us inept developers can do.
     
    minionnz, Mar 26, 2015 IP
  10. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,998
    Best Answers:
    253
    Trophy Points:
    515
    #10
    You're probably used to wasting 200 to 500k of JS to do 0 to 50k's job.

    I don't have a problem with that -- my problem is that 90%+ of the scripttardery being sleazed into websites usually falls into three categories:

    1) Things I could do in 1/10th the code.

    2) Things that are CSS' job.

    3) Things that have NO damned business on a website in the first place.

    ... and that's just the JS. If you don't know what's wrong with crapping out 200+k of CSS (basically ten times what a site like the above requires), then you really have ZERO business making websites. Mind you, I'm NOT saying minification is bad, but it shouldn't be used to sweep bad code under the rug and pretend it's not the REAL problem -- REAL problems like a megabyte of JS and CSS doing 50k's job. But sure, it's easier that way... RIGHT.

    I laugh and/or cry when people say the framework asshattery makes it easier: you start out with 5 times the code that should be needed BEFORE you even make it do anything, then add 5 times the JS and/or CSS you should need AND double the size and complexity of the markup -- while pissing away accessibility and even the entire reason to be using HTML and CSS in the first place... and it's somehow EASIER? Wow, that's some REALLY GOOD BULL right there; Religion and Politics stand in AWE, outright jealous of said quality grade-A holy freaking bull.

    "Let's put things into perspective"? Fine -- point by point:

    On "downloaded once, then cached" -- which is why it belongs in an external file, not the markup, and... hey, is that me bitching about STATIC CSS and JS in the markup up above?

    On gzip -- that's ignoring that gzip takes processor time on the host if it isn't already sitting compressed in a cache.


    On parallel downloads -- unless your connection limits are choked out by your provider, by sharing connections, by leaving uTorrent running in the background, or by streaming video while browsing.

    The first of those being the real danger, with the impending bandwidth crunch being a hefty chunk of what's driving the whole "net neutrality" debate.

    On Facebook -- that size is exactly why a LOT of people won't use it. I'm fortunate that for some reason it loads acceptably here, but my neighbor? FORGET IT (2+ minutes).

    They're lucky nobody with the proper resources is making a serious effort to take them down. No, Google+ was NOT a serious effort -- though it was a stunning example of developer ineptitude.

    If by "dynamic" you mean sucktastic to the point that I no longer bother with many websites I used to frequent: Weather.com is a stunning example of a site that was useful 15 years ago and is such a painful mess now it might as well be the worst website ever.

    Or Newegg, whose latest website update has ended my decade-and-a-half business relationship with them. I'd sooner go to e-fence/fleabay/whateverDerogatorySlangIsPopularNow for "buy it now" listings than suffer through the slow-loading, impossible-to-use, inaccessible DISASTER they have the unmitigated GALL to call a website.

    ... or Google search, where the ONLY reason I put up with it is that their results are STILL that damned good; if there were an alternative with results as good, I'd have ditched them years ago. For a WHILE it looked like DuckDuckGo was going to be that alternative, but they've gone and pissed their bed the same way Google has, defeating the ENTIRE reason they were even created.

    Said pissing of the bed basically being the decision that all the things that killed off "Ask Jeeves" a little over a decade ago are now hot and trendy and have to be added to their sites.

    Well, bully for you! Me, I have nothing BUT problems with the size of that site, as evidenced by the 20-to-40-second-plus page load, the general accessibility issues stemming from the broken practices that created the bloat, and the general incompetence of its creation.

    Much of that incompetence isn't the developer's own, but bad practices they've been taught or been suckered into by people who should know better. Sucker bait like jQuery, Bootstrap, WordPress, and every other sleazy shortcut that has made it so the number of websites I now visit willingly can be counted on one hand.

    Puts on DI hat to mix Ermey's delivery with Samuel L. Jackson's voice:

    DAMNED ***** ***** STRAIGHT THEY **** **** NEED TO BE INSULTED!!!

    I am SICK TO DEATH of sleazeball, asshat, BS, code-bloat, ignorant garbage being promoted as good practice and generally treated as acceptable by the industry as a whole. It's nube predation, taking advantage of people's ignorance, apathy and just plain wishful thinking! It's about time we started getting insulting, since, to be frank, after a decade of this nonsense pissing all over the actual improvements in methodology that STRICT gave us, simply saying "that's bad" is quite insufficient.

    Basically I'm at the point where I'm no longer playing nice -- to the point where I'm probably going to be far, FAR less tolerant and polite about it than I have been in the past.

    That last paragraph probably has those who know me shaking their heads in disbelief. Thought 2015 couldn't get worse? SUCK IT UP, PRINCESS, we haven't even STARTED yet!

    There's a reason right now all I have for the majority of people crapping out websites is two birds -- one for them and one for the source they code in on.

    It's either that, or I say "whatever", give up on everything, and go to an early grave like Dan, Tim, Joe and Radesh did. I'm also getting sick of outliving brilliant advocates of good practices half my age while the scumbag scam artists become the next William McCloundy or Don Lapre.
     
    deathshadow, Mar 27, 2015 IP
    COBOLdinosaur likes this.
  11. COBOLdinosaur

    COBOLdinosaur Active Member

    Messages:
    515
    Likes Received:
    123
    Best Answers:
    11
    Trophy Points:
    95
    #11
    Not much I can add to what DS said, except that the biggest problem on the internet today is wannabe "experts" posting on Q&A sites, supporting the idea that bloated, non-standards-compliant pages -- where the semantics, accessibility, and usability are piles of vomit -- are part of the "new web", and that all the trash being turned out for WP and other amateur tools is cool.

    It is unfortunate that web developers don't require a license or certification. As it is, anyone can be a web expert and proclaim that garbage is good and that we should stuff as much crap as possible onto pages to make them cool. I don't know what planet they come from, or how their education could be so lacking that they are not capable of presenting logical arguments in favor of the crap-is-cool idea.
     
    COBOLdinosaur, Mar 29, 2015 IP
  12. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,998
    Best Answers:
    253
    Trophy Points:
    515
    #12
    Aka lame excuses -- the laugh being that they flat-out ignore TWO DECADES of usability experts telling them "OH *** NO", ignore what the specifications are telling them, what the guidelines are telling them, and what the search engines are telling them.

    What with both Google and Baidu penalizing slow-loading sites; kind of a laugh, since with properly built sites Google's "page speed service" actually makes them slower. Of course, their whole "PageSpeed" thing has reeked of being taken over by the scams ever since sites that by their own admission take 20-40 seconds get ranked "faster" than sites that load in a quarter of that time, JUST because the site that's ACTUALLY faster isn't such a bloated mess that it needs a CDN.

    Though at this point Google is exhibiting the schizophrenia common to larger organizations, where things like the page-speed service go and piss all over everything the web-spam team is doing to slap down abusers.

    ... and it's the same exact BS they've been spouting for a decade and a half -- there's a reason most of these sites might as well be the worst site ever, given how useful they are to users. But of course they have all their lame excuses to hide behind -- or they react like faithtards when confronted with facts that contradict the dogma they've been brainwashed with; the typical ignorant "don't give me facts, I know how I feel about it".

    All so they can continue to plod on in ineptitude while scamming the ignorant.

    Though increasingly I think legislation like that in the UK needs to be a bit more widespread: sites that fail to meet accessibility minimums -- at least for PROPER businesses -- being fined for basically being discriminatory. That would at least make a small dent in the endless horde of "gee ain't it neat" bloated BS and general developer-dumbass "worst of 1997 but we call it modern" garbage.

    Hence why, EVERY time someone says it's "the new web", my response (much like my reaction to HTML 5) is "Really? Looks like 1997 to me." Of course, with people like Mark Cuban and Peter Schiff now predicting what I've been saying for 5 years -- a repeat of the dot-com bubble burst, only worse, since most web businesses have no liquidity -- having code and practices typical of that era (right down to the marketing-scam advertising BS) just adds one more indicator that everyone has forgotten what caused it in the first place. Since, as always, those who fail to pay attention to history...
     
    deathshadow, Mar 29, 2015 IP
  13. minionnz

    minionnz Greenhorn

    Messages:
    17
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    20
    #13
    Don't try to imply I'm some "wannabe expert" - I've worked on reputable, global projects and I know my technologies/tools very well. I've been doing this for over 15 years - perhaps not as long as some other developers, but I certainly remember what it was like designing for IE4/5/6. I saw the birth of AJAX, jQuery and responsive design. I watched the evolution of JavaScript, HTML and CSS, as well as browsers becoming more standards-compliant.

    These "gee ain't it neat" bloated scripts such as jQuery exist for a reason - cross-browser JavaScript was such a pain that it was avoided as much as possible. Same for CSS.
    Sure, browsers have evolved now and it's much easier to write solid JS without the need for jQuery - but there are still a LOT of inconsistencies. When you're using a popular framework hosted on a public CDN, the end user ends up downloading LESS JavaScript than if you had created your own "I can do it in less" script.
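    The usual belt-and-braces version of that pattern, as a sketch (the local fallback path is a placeholder):
    Code:
    // Placed in an inline script right after the CDN <script> tag:
    // if the CDN copy failed or was blocked, pull in a local fallback.
    if (!window.jQuery) {
      document.write('<script src="/js/jquery-1.11.2.min.js"><\/script>'); // placeholder path
    }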

    These aren't opinions - they're facts. They've been discussed to death elsewhere. This whole argument is getting ridiculous. You both claim that everyone else is ignorant and inept, but completely ignore any logical, valid argument that doesn't fit your point of view. I can't even be bothered going into detail about any of them, because I already know how you will respond.

    WP/jQuery/CSS doesn't create bad code - developers do. You're always going to find both good developers and bad developers - implying the entire industry is bad is ignorant.
    If you seriously think the current state of the web is anywhere near as bad as 15-20 (or even 5-10) years ago, you need your memory checked. We've come a long way - accessibility is still ignored a lot, I agree, but at least I can see we're heading in the right direction.

    This scaremongering is ridiculous. I came to this forum to try to help others out, but now I'm wondering if I should even bother when threads are being derailed with insults and unjustified "facts" - it's everywhere. I'm done with this argument.

    @sampath1 - send me a PM if you're still looking for some constructive advice.
     
    minionnz, Mar 30, 2015 IP
  14. BrianAanan

    BrianAanan Greenhorn

    Messages:
    9
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    11
    #14
    So, practically, you're a design house. Not bad; there's room to improve. Where are you/your team from?
     
    BrianAanan, Mar 30, 2015 IP
  15. deathshadow

    deathshadow Acclaimed Member

    Messages:
    9,732
    Likes Received:
    1,998
    Best Answers:
    253
    Trophy Points:
    515
    #15
    For CSS it wasn't a pain if you had ANY clue what you were doing and bothered putting your mind towards accessibility. Sure, IE made it a bit tough in corner cases, but if you were using semantic markup, separating presentation from content, and had learned a few simple layout methods, there was no reason you couldn't make a modern responsive layout that gracefully degrades all the way back to IE 5.5 with little to no effort (and to any pre-CSS browser as well! The "problem" zone is between pre-CSS and IE 5.5).

    CSS frameworks result in presentational use of classes, more markup to write, and people failing to even learn what semantic markup is or why it's important -- and they have just as many, if not MORE, failings than sleazing things out in a WYSIWYG.

    For JS, you say that like it's a bad thing. No, seriously: 80%+ of the crap people vomit up in JavaScript right now falls into two categories -- CSS' job, or stuff that has no blasted business on a website in the first place. Mix frameworks into that and it just adds a third category: where something doesn't fall into the first two, it's almost always something that could be done more efficiently, with less code, WITHOUT the framework -- and that's without counting the size of the framework against it!
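    To make the "CSS' job" category concrete -- a hedged sketch, not anything taken from the site under discussion, and the element ids are hypothetical: a show/hide that plenty of scripts animate in JS can be a one-line class toggle, with the animation itself left to the stylesheet:
    Code:
    // The stylesheet does the animating, e.g.:
    //   .panel { max-height: 20em; overflow: hidden; transition: max-height 0.3s; }
    //   .panel.closed { max-height: 0; }
    // JS only flips state. (classList is IE10+; older IE needs className juggling.)
    document.getElementById('panelToggle').addEventListener('click', function () {
      document.getElementById('panel').classList.toggle('closed');
    }, false);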

    On the public CDN -- that's hoping the extra host is already resolved in the DNS cache, since that extra lookup can actually add more overhead than hosting the file locally.

    As I just said, the "do it in less" claim usually does NOT include the size of the framework in the count... and most of the time, where it would be 'less' with the framework, it's either crap that doesn't belong on websites in the first place, or it's CSS' job.

    Facts? No, they're lies -- at least in my experience. Giant, dumbass lies that are nothing more than hot and trendy buzzwords duping people into thinking they're factual; again, not to compare it to 'faith', but... the 'defenses' used by most of the framework crowd make creationists sound sane and rational.

    Present a logical valid argument, then we'll talk...

    Like the developers of WordPress and jQuery -- and I ASSUME you meant CSS frameworks, not CSS itself. Like turdpress shoving bad markup down your throat that you can't even fix without either nerfing your upgrade path or post-processing the output; like jQuery actively encouraging sloppy, bloated and slow scripttardery; like CSS frameworks, by their very nature, pissing on semantic markup and separation of presentation from content from so on high you'd think Pesci himself just left a kegger.

    Not when the majority of the industry seems to be telling users to go **** themselves... which is basically what MOST of the stuff being railed against by devs like myself and COBOLdinosaur amounts to.

    Accessibility isn't just ignored; it's been wiped with, pissed on, poisoned, stabbed, drawn and quartered, and burned at the stake. SO much of the outright CRAP being sleazed out right now drags site usefulness and speed back to what they were in the late 1990s. BEST CASE, you might be at the bleeding edge of 2003.

    At least six years ago I wasn't diving for the zoom on EVERY website -- usually just on sites with crappy forum-software skins. At LEAST having broadband meant something then, and people TRIED to make lean sites instead of this "everyone has fast broadband and 30ms ping times" bullshit -- so sites like Newegg loaded in ten seconds instead of a minute and a half.

    Frameworks, PSD jockeys calling themselves designers while lacking the knowledge to design a blasted thing, and off-the-shelf solutions hit with a three-pound lump hammer to pound the square peg into the round hole are just the TIP of the idiocy iceberg. Outright mouth-breathing stupidity like HTML 5 undoing ALL the progress and intent of 4 STRICT is even more to blame, as the W3C seems to be shrugging its shoulders and saying "**** it, just sleaze it out any old way" -- kowtowing to the halfwits and fools who until recently were sleazing out HTML 3.2 and slapping a 4 Tranny doctype on it. As I keep saying, now they get to slap 5 lip-service around the same outmoded, broken, inaccessible and inefficient code and pat each other on the back over how "modern" they are.

    Then help people out by providing actual advice... which is basically what I've been doing here for close to a decade.

    But no, we've been hammering on your nice, happy little "nothing is wrong, move along, people" bubble, so that nice little cognitive bias has you going "la-la-la-la" like a second-rate Vancome lady.
     
    deathshadow, Mar 30, 2015 IP
  16. COBOLdinosaur

    COBOLdinosaur Active Member

    Messages:
    515
    Likes Received:
    123
    Best Answers:
    11
    Trophy Points:
    95
    #16
    It always amazes me that the guy spouting nonsense is the one who claims to have the "facts" and is calling for logical arguments.

    So let me give you some facts. jQuery and the rest of the slopware junk damage the document object, and they cause the rendering engines to jump through hoops to try to actually make their trash work. Don't think that's a fact? Then download the Firefox Gecko code, or the Chrome Blink code, or the WebKit code, and see how much overhead is required to support the incompetents who can't code a web page without giving the browsers stuff to puke on.

    I've spent enough time inside that code to know where the problems are. I don't have to guess; and comments supporting the lazy, incompetent approach are neither fact-based nor logical. They are simply excuses to play in the garbage dump being created by the advocates of "any trash is good if some idiot is willing to pay me for it".

    Then do it by giving honest, fact-based information instead of fantasyland cool-and-trendy responses that encourage bad practice.
     
    COBOLdinosaur, Mar 31, 2015 IP
  17. minionnz

    minionnz Greenhorn

    Messages:
    17
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    20
    #17
    Agreed. 100%.

    And I'd argue that rendering engines having to jump through hoops has nothing to do with jQuery. I've worked with the WebKit code myself. If you insist on saying otherwise, please point out the parts of the code where this is apparent.
     
    minionnz, Mar 31, 2015 IP
  18. ketting00

    ketting00 Well-Known Member

    Messages:
    772
    Likes Received:
    27
    Best Answers:
    3
    Trophy Points:
    128
    #18
    People are arguing so hard, but all those good practices and theories are useless for a user like me, because I use a custom-built PC, an extreme gaming laptop and the fastest internet package available at my home. And I know how to use hardware acceleration.

    On the go, I use the most powerful smartphone available on the market, on 4G service. If 5G becomes available, I'll be among the first adopters.

    All I need from websites I visit on mobile is content, content and content, plus things I can waste my time with (at a party :) ). But all I need from websites I visit on PC and laptop is a great user experience.

    I think developers should focus on that [at least to attract users and ultimately bring in money -- disclaimer: I'm this type of developer -- unless they want to do it for charity and pride]. But that's just my opinion and what I want.

    It's not true that I only use a PC when I want a snippet of code from the web. Several times I've gone looking on mobile and shared snippets to a social network as bookmarks to read later. That's why I have several accounts on a social website.
     
    Last edited: Mar 31, 2015
    ketting00, Mar 31, 2015 IP
  19. COBOLdinosaur

    COBOLdinosaur Active Member

    Messages:
    515
    Likes Received:
    123
    Best Answers:
    11
    Trophy Points:
    95
    #19
    WebKit is the rendering engine, and the damage has already been done by the time it starts to paint. The HTML parser has already executed the error handling needed to fix the nonsense created by invalid markup produced by jQuery plugins. That's because jQuery parses CSS declarations left to right, instead of the more efficient right-to-left order used by the CSS parsers. The result is that jQuery does not always produce the correct computed value, and does not necessarily serialize to the CSSOM specification. Fortunately, when a script executes inline, parsing is suspended until the script finishes executing. That slows the load, but it gives the parsers a chance to run the necessary error-handling code to fix the mess and produce correct computed values. The problem is not directly jQuery, but rather the opportunity it gives unskilled developers to make a mess.
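    To illustrate the right-to-left point in isolation -- a toy sketch, not code from either engine: for a descendant selector like "div p", a right-to-left matcher starts from the candidate element (the rightmost simple selector) and walks up its ancestors, instead of scanning every div's entire subtree:
    Code:
    // Toy right-to-left match for a descendant selector like "div p".
    function matchesDescendant(el, ancestorTag, targetTag) {
      if (el.tagName !== targetTag.toUpperCase()) return false; // rightmost part first
      for (var p = el.parentNode; p && p.tagName; p = p.parentNode) {
        if (p.tagName === ancestorTag.toUpperCase()) return true; // then walk up
      }
      return false;
    }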

    If jQuery had some kind of recognized standard, it might be possible to eliminate the conflict, but right now anything goes, and the idiots writing most of the plugins don't care what kind of compatibility issues, conflicts, accessibility limitations, and bloat they spew out, as long as they can get something working on a demo page well enough to hook the gullible into believing they're looking at something worth using. Firefox manages to avoid some of the problems because it attaches differently than Chrome does, using a hash table in the rendering engine's memory space. It's less efficient than the way Chrome attaches, but it was written before Chrome was even conceived of, and it eliminates the kind of repainting issues that Chrome has been plagued with as it tries to implement new dynamics.
     
    COBOLdinosaur, Apr 2, 2015 IP
  20. PoPSiCLe

    PoPSiCLe Illustrious Member

    Messages:
    4,623
    Likes Received:
    725
    Best Answers:
    152
    Trophy Points:
    470
    #20
    The things you point out are indeed a problem, but it's not a problem with jQuery directly - it's a problem with inept developers. jQuery can be a big help (with helper plugins) if you need to do something less than simple - for instance, coding a complete set of drag-and-drop grid systems can be a real pain in the ass in regular JavaScript, while with jQuery it becomes rather simple, albeit still a bit of a code nightmare. Of course this isn't typical everyday use, and it DOES wreak havoc on usability if the user doesn't have JavaScript, for instance - but then again, the site is meant to be used with JavaScript, and even though there is a fallback method (form reloading instead of AJAX events), it's cumbersome and takes a lot of (unnecessary) time.
    Sometimes the jQuery framework is cumbersome and bloated - no doubt about it - but then, some functions are quite a lot easier, and I wish regular JavaScript would adopt some of the shortcuts (especially the selectors).
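    (For what it's worth, plain DOM has had a chunk of that for a while -- a quick sketch of the native near-equivalents; the selectors and class name here are hypothetical:)
    Code:
    // querySelector/querySelectorAll cover most jQuery-style selection
    // (available since IE8 for CSS2 selectors):
    var first = document.querySelector('.menu li.active');  // ~ $('.menu li.active').first()
    var links = document.querySelectorAll('#nav a');        // ~ $('#nav a'), but a static NodeList
    Array.prototype.forEach.call(links, function (a) {
      a.className += ' highlight';
    });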
     
    PoPSiCLe, Apr 2, 2015 IP