This article nails every one of my pet peeves in modern web development, but they all come down to one glaring issue: ignorant developers assuming everybody has fat pipes and high-end gear, and going nuts with the JavaScript.
Here's the short version of this transcript of a long talk: if your website or web app doesn't provide its core functionality in lynx on a 56K dial-up connection, then you fucked up and need to fix it.
ignorant developers assuming everybody has fat pipes and high-end gear
You are so very very wrong here. Developers aren't idiots or ignorant, and it is in fact ignorant and arrogant to look at this situation as if they are. Developers know there are people on 56k. They know there are people on ridiculously high latency sat or mobile connections and that the UX is atrocious for those people. They know perfectly well, and they develop with heavy JS anyways. You MUST look at the issue with that in mind, because 'hey devs, did you know this?' is stupid - they know!
There are 2 key reasons for this, which actually boil down to 1 single reason. The intermediate reason is devs don't usually make the decision to have sites do stuff - they get paid to deliver functionality to a client, and the heavy js is required to fulfil that spec. The primary, golden reason however: clients and devs don't give a flying fuck about the people on poor connections. It's not a nice thing about the world, but it is reality and trying to deny it will just mean everything you do to try and fix it is a complete waste of everybody's time. They don't care, because the people with poor connections are, well, poor people. They are the people without the money for decent broadband, the people who live in rural areas, the people who the client doesn't need to target with their ads or platforms, who represent a shitty return on work invested for devs, who are that 20% that Pareto taken to its extreme says 'fuck those guys'. Areas with poor infrastructure are hard to target, hard to sell to, and there are so many other people who can pay more with less effort expended towards them. It's basic if unethical business.
And it's only going to get worse - much worse. I made a bit of a prediction just the other day which is based on something I've seen working with enterprise IT innovation teams for large scale banks, ERP/CRM platforms, universities, and government services (that one is fucking terrifying, imagine tiered service based on internet connection for health services). Getting the internet was the big equaliser for things like education, job hunting etc - now we're going to see class stagnation caused by connectivity lock-in. People with better internet connections will more and more be treated better.
For the yanks: the bigger threat to your Dream isn't Russia, it's Ajit Pai.
Honestly, it's surprising how many sites work fine with NoScript.
I'll eventually write a blog post about this, but I think the term "world wide web" has outlived its usefulness. I think there are actually several different overlapping webs. Or, if you prefer, there could be several different webs.
One of them is the original hypertext web of personal websites consisting of text accented with graphics made by people for other people to read. We'll call it the "human web". The human web is badly tattered, and Google doesn't really work well for discovering interesting sites on it, but the fundamental technology is sound and can be made to work on just about any device manufactured in the last 15 years or so. And as long as you don't go nuts with the images, it still works OK on 56K dial-up. The same can be said about IRC, IM via XMPP (Jabber), and email.
The downside is that human web tech is clunky, and using it requires some knowledge and effort. Unix users can use a variety of static site generators, Mac users can use slick apps like RapidWeaver, but I'm not sure there's anything to help the average Windows user build and deploy static websites easily. Also, while domain registration and hosting isn't too expensive, it isn't exactly free, either.
The downside is that human web tech is clunky... Also, while domain registration and hosting isn't too expensive, it isn't exactly free, either.
I use neocities to host my own web site, and I pay absolutely nothing. I get an entire gigabyte of space, which is honestly more than I'd ever use with text, and while it's more of a problem for images, it isn't so horrible. I don't think it's too clunky, but sure, it requires knowledge and effort. I think that's a good thing because any platform that doesn't require knowledge and effort will have people on it that have very little knowledge, and put in very little effort. In my opinion if someone cannot grasp how to use HTML to, at the very least, display some text on a web page, they shouldn't be making content on the Web, for the most part.
The human web is badly tattered
Have you heard of wiby.me? It's a search engine that indexes only sites that I think you'd call the "human web"; I recommend you visit it when you have some free time. It really feels refreshing to use, as if you've gone outside the walled garden, so to speak, but in this case the real beauty is on the outside of the garden, not inside.
if your website or web app doesn't provide its core functionality in lynx on a 56K dial-up connection, then you fucked up and need to fix it.
Of the top 100 websites in the world, how many would realistically render and function correctly in lynx today? Can it even understand modern layouts like CSS grid or flexbox?
Can it even understand modern layouts like CSS grid or flexbox?
No. And a bunch of the world's browsers can't either.
Support for CSS Grid was only added to Android's browser, and the equivalent Android Chrome, in May of last year. Consider for a moment that crappy, cheap, and woefully out-of-date Android phones are effectively the computers of the third world.
No. And a bunch of the world's browsers can't either.
Are you sure? Grid is still fairly new, but already has very strong support. Flexbox has been well supported for a few years now too.
https://caniuse.com/#feat=css-grid
https://caniuse.com/#feat=flexbox
With the advent of evergreen browsers, support for new standards can actually be introduced very quickly today.
Regarding Android's browser though, I wouldn't be too surprised if it were being neglected. Google seems to have moved focus to their own apps versus AOSP.
Consider for a moment how crappy cheap, and woefully out of date, Android phones are effectively the computers of the third world.
Yes, I am very sure Grid does not have strong support the world over.
One study gives me 25.5% of Android phones on Marshmallow, and 22.9% on Nougat 7.0.
Both will never receive the latest Chrome updates (unless their manufacturer does the backporting work themselves).
So neither of those have Grid support.
Let's be super-generous and say 50% of manufacturers backport the updates. That still leaves a quarter of active Android phones without support.
Android represents roughly 85% of smartphone usage.
So roughly 40% (I rounded down) of smartphone users will not have Grid support.
One study gives me 25.5% of Android phones are on are on Marshmallow, and 22.9% are on Nougat 7.0.
Both will never receive the latest Chrome updates, (unless their manufacturer does the backporting work themselves).
According to this, Chrome for Android supports 4.0 (Ice Cream Sandwich) and up. Devices with M (6.0) and N (7.0) definitely support Chrome. You're probably thinking of WebView, which used to be a system app updated with every new Android release. For devices with 5.0 (Lollipop) and up, Google offers WebView in the Play Store, allowing it to be updated with new features and security patches. It is based on Chrome, and for devices running 7.0+ with Chrome installed, it lets Chrome render pages. Developers can now use Chrome Custom Tabs (introduced in Chrome 45), which are better than WebView because they can use Chrome cookies, autofill, data saver, etc. in addition to being faster and (according to Google's marketing materials) offering better customization and app integration.
The state of web development for Android is a lot better than it used to be. Android version doesn't mean as much as it used to, as long as OEMs install Google's WebView APKs rather than compiling their own.
You're probably thinking of WebView, which used to be a system app updated with every new Android release.
I wasn't. I was thinking of devices that can't update. Of all the devices costing less than $150, which either:
Don't have the Play Store installed
Are listed as incompatible for updates
Or have...
You can install apps from external sources - not just Play Store
You can install other browsers - not just Chrome
Market share of <4.0 android versions is about 0.2%
I'm not sure I understand your comment. What exactly are you trying to point out? Do you still stand behind your previous comment about Chrome?
I do, and I'm not sure how well you've read my arguments.
Market share of <4.0 android versions is about 0.2%
Cool. However, I pointed to the market share of other versions, which comprises 25%.
You can install apps from external sources - not just Play Store
Not really relevant when speaking of the average user.
You can install other browsers - not just Chrome
Indeed, I use Firefox. But the overwhelming majority use what came with the phone.
So you do stand behind this argument?
[Android phones on 6.0 and 7.0] will never receive the latest Chrome updates
Sorry, but this looks like trolling. Do you have some source? According to Google, >=4.0 will receive the newest Chrome updates. The older versions (0.2% from my source, 0.3% from yours) never even had the option to install Chrome.
I didn't say anything about rendering "correctly", whatever that even means, and if you can't implement your functionality using only HTTP GET and POST, then that's your problem.
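To make that concrete, here's a minimal sketch of the forms-only approach, assuming an Express server; the /vote route and in-memory store are hypothetical:

```javascript
// Hypothetical sketch: core functionality over plain GET/POST, no client JS.
// Assumes Express; the /vote route and the vote store are made up.
const express = require('express');
const app = express();

app.use(express.urlencoded({ extended: false })); // parse form POSTs

const votes = {}; // toy in-memory store

// A plain <form method="post" action="/vote"> works even in lynx: the
// browser submits the POST itself, and Post/Redirect/Get sends the reader
// back to where they were. (Validate returnTo in real code.)
app.post('/vote', (req, res) => {
  const id = req.body.commentId;
  votes[id] = (votes[id] || 0) + 1;
  res.redirect(req.body.returnTo || '/');
});

app.listen(3000);
```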
Rendering correctly means to spec, of course. That's what web standards are for. And if your browser isn't capable of supporting those standards, then it isn't a very good browser.
I'll start by saying that I agree almost entirely with the conclusion - sure, maybe "lynx on 56k" is a more extreme barrier than I'd set, but that's splitting hairs. I'm with you, and with the original author.
What I find odd is that despite agreeing with the article as a whole, most of the actual points in it just don't match what I've seen in the industry. To take a few key points:
Yet we just don’t care about page load times. Does the average site load within 3 seconds? Does yours? We don’t test this regularly. It is not part of our standard metrics in our industry, despite it being so tightly correlated with user satisfaction.
It is absolutely part of the standard metrics. Server response time is a top line chart on literally every system I've used; both Google Analytics and the Chrome devtools put a huge focus on overall load time. Whole streams of talks at major conferences are focused on performance, often shaving just tens of milliseconds. The assertion that any decent dev wouldn't be checking those metrics weekly at the very least is totally alien to me.
30% of the rural USA is closer to dial-up speeds than broadband speeds.
I read the references, including the PDFs from the FCC, and I can't reconcile this with what they say. Just because they're below the 25Mbit/s (or even the older 4Mbit/s) definition of broadband doesn't mean they're anywhere near being closer to dial-up. That's a difference of two orders of magnitude. The next paragraph in the article appears to mix up kilobits/kilobytes, so I may be misinterpreting, but the idea of 1-5 minute load times for any appreciable portion of the population doesn't seem to fit the data. Sure, 10+ seconds is still unacceptable, but it's a very different issue.
Instead of HTML being generated on, and delivered from, the server, a JS bundle is sent to the client, which is then decompressed and initialised and then requests data, which is then sent from the server (or another server, as now everything is a service) as JSON, where it is then converted on the fly into HTML
One of the major selling points of React, Angular 2, etc. was the ability to render on the server, send HTML, and then let the client JS pick up afterwards. I agree wholeheartedly that we should tar and feather devs who don't use this capability - but at least anecdotally, I saw people jumping for joy because they could render in one line rather than using the clunky headless browser solutions they'd previously been using (at great effort) to mitigate that exact problem.
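For context on that pattern: the server renders the component tree to HTML, and the client then "hydrates" the existing markup instead of re-rendering it. A rough sketch using React 16 era APIs; App, bundle.js, and the wiring are placeholders:

```javascript
// server.js - a sketch of server-side rendering with React 16.
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App'); // placeholder component

function renderPage(req, res) {
  // The server sends real HTML, so the page is readable before any JS runs.
  const html = renderToString(React.createElement(App));
  res.send(`<!doctype html><div id="root">${html}</div>` +
           `<script src="/bundle.js"></script>`);
}

module.exports = renderPage;

// client.js (compiled into bundle.js) - hydrate() attaches event handlers
// to the server-sent markup instead of rebuilding the DOM:
//   ReactDOM.hydrate(React.createElement(App), document.getElementById('root'));
```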
You’re a web developer. Your job is to make a site work for everyone, in all conditions.
On this, I couldn't agree more!
I worry a little about being overly critical - after all, why argue if we ultimately agree? - but there was enough in there that struck me as odd that it started to detract from the excellent points on empathy and accessibility.
I fucking hate javascript. I hate programming in it. I hate seeing it on web pages. I hate how people hack it to do things it's not supposed to do. I hate when people hacking it to do things it's not supposed to do causes problems which I have to solve by hacking it even more, or hacking another program, because someone decided to program in a language that should be killed already.
I run uMatrix not because I'm worried about my browser being hijacked, but because I don't want a page to take a million years to load (I'm on 1Gbps internet, so that's not the problem). Sure, it's nice that they have trouble tracking my data if I do this, but honestly it's because JS can fuck right off, down into a hole, and can preferably be set on fire on its way out to maximize its suffering.
There's a reason I keep Douglas Crockford's JavaScript: The Good Parts with other fantasy novels.
There are no good parts.
The good part is that you can choose to not use it.
And I've done exactly that with sites I own. Like this motherfucker.
Oh. I expected this website.
I didn't want to say it on my earlier comment, but I hate writing JavaScript too. It's a terrible, terrible language, and it wears the "design by committee" badge like no other language ever has. It's the Calvinball of programming. It has just about every feature under the sun but you shouldn't use them because they are implemented terribly. And yet somehow it still does not have type hinting.
I personally don't care about JS. It could've been COBOL or Haskell; that is not the problem for me. It is the fact that JS is a remote code execution vulnerability. Ideally a browser should warn before allowing it to run, just like it does with notifications or location. I hate the idea of running random Turing-complete code on my computer, and in a piece of software, namely the browser, that is so intimate to my private life.
You seem like a knowledgeable folk: do you think there will ever be a true alternative to Javascript?
There already are - look at just about any other coding language.
The reason it gets used is developers are lazy and want to restrict how much is being done on their servers and offload more to you. Since most computers have js, they go to it first.
Even for front-end?
I think Gaywallet missed the point a little. At present there's not really a good alternative to run in-browser. The problem is that your website would need to distribute the code to be run in the browser and the browsers would have to support it. Browsers only support JS, so websites are built in JS, and websites are built in JS so browsers are only supporting JS. It's a catch-22. On top of that, there's the problem with supporting legacy web browsers because some people refuse to move into the current decade. And finally there's the question of "which language would we use?", which everyone will totally come to agreement on (sarcasm, if it wasn't obvious enough).
There is an attempt at fixing this problem, though. It's called WebAssembly (wasm), which should in time allow you to write in a completely different language of your choice, compile down to wasm, and run it natively in your browser. No specific programming language and (if I remember correctly) completely inter-operable with JS.
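A minimal sketch of what that looks like from the JS side; module.wasm and its exported add() are hypothetical, but the WebAssembly API itself is standard:

```javascript
// Load and run a wasm module compiled from C, Rust, etc.
// (The server must send module.wasm with the application/wasm MIME type.)
const imports = {}; // any JS functions the module wants to import

WebAssembly.instantiateStreaming(fetch('module.wasm'), imports)
  .then(({ instance }) => {
    // Call an exported function as if it were ordinary JS.
    console.log(instance.exports.add(2, 3));
  });
```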
If it does take off, then it will be a while before it does. Tooling is going to be important, as well as browser support. With any luck, though, it could end up competing fairly well against JS.
Good luck with getting JS to die completely, though. It's going to be around until people decide to stop using it, which is likely to be never.
As a developer, I love that I can introduce new functionality to my websites with JavaScript.
As a user, I hate excessive JavaScript with a passion. I'm tired of pages hijacking my scrolling, popping up elements to advertise to me, and losing basic browser features because an idiot decided they needed to arbitrarily enforce their own UX decisions. Most of all, I hate that I have to wait while the computer freezes because some asshole who runs the website decided that it was better to offset complex rendering jobs to my computer instead of theirs.
And Angular is just terrible no matter which way you view it.
If all I am trying to do is read an article or some basic dataset, I don't need an entire application to load and render it. Just serve plain HTML and CSS.
Oh, you started scrolling on the web page? Have a full-sized window with a button you have to click that serves no purpose other than to annoy you.
Oh, you started moving that annoying mouse wheel half a millimetre? Enjoy the functionality of a really slow PgDn because we only use our keyboards to navigate the internet and wanted to make it functional for you, mouse users!
You dared move the mouse outside of the viewport?
Here's a pop-up desperately asking you not to leave, even though you didn't have any intention to leave at that point
There are problems with how devs at large use JS, but this attitude is very harmful. It casts the past in such a golden light that it fails to see a way of moving forward. JS is the programming language of the web and it's not going anywhere. We need to teach people to use JS better. But SPAs are far, far better than server-rendered apps for big applications. For a plain site, yeah, just push down your static HTML.
Harmful for whom? For a language you tout as being "the programming language of the web", it doesn't seem to do much besides adtech. Everything else you can do with JavaScript can be done server-side, or in native apps.
From what I've seen, corporations push JS because there are a shitload of JS devs and most of them aren't old enough to know better, so they're dirt cheap.
The reason I look backward is that most developers have no sense of history. They just keep reinventing the wheel with new programming languages and frameworks.
or in native apps
Web apps are the replacement for 98% of native apps; the browser functionality is just so new relative to the web's existence that the apps haven't been made yet. I make a bunch of these "native replacement" type apps and host them on https://apps.nektro.net/ with more to come. It's on GitHub too. I'm also looking at web apps like https://mobile.twitter.com/ and similar.
We're never going to be able to agree because you seem to be OK with the notion of web browsers as virtual machines, and I'm not.
If I wanted to be tied to the internet at all times, I would use a Chromebook as my primary computer instead of a used ThinkPad running OpenBSD. If I wanted to do all of my computing on an incredibly powerful machine that somebody else owned and operated, I would get a free shell account on sdf.org or pay PANIX.com ten bucks a month.
I don't think web apps are new. I think they're just the latest incarnation of the dream that drove GE, Honeywell, and AT&T to invest hundreds of hours and millions of dollars in the development of MULTICS. Web apps, SaaS, and PaaS are just the old timesharing model of dumb terminals talking to a mainframe rebranded with new buzzwords.
I don't blame you for investing in web apps, but if they really are the future then you can keep it. Maybe I'm just paranoid, but I don't trust computers that aren't under my direct control.
Then again, you're almost stuck with using the browser as a hardware abstraction layer on mobile devices, because Google and Apple pretty much ignored the entire history of Unix, POSIX, and desktop PCs when designing and implementing Android and iOS, so that there is no common API, standard library, or programming language you can use to directly target both platforms that isn't the fucking browser, HTML5, and JavaScript.
How would you solve the problem of app distribution in a way that's easily updated, easily shared, and cross-architecture/OS?
Don't distribute apps. Distribute source code. Write and document your code so that it can be ported to every platform for which there's sufficient demand.
This is why browsers like Chromium (the open source version of Chrome) and Firefox are available for Windows, OSX, hundreds of GNU/Linux distributions, FreeBSD, NetBSD, OpenBSD, etc.
It's hard to do this on mobile because Google and Apple don't make any effort at compatibility or interoperability. That leaves developers like you stuck using a web browser as a hardware abstraction layer. You should be able to code for mobile at a lower level, but that's unnecessarily difficult for you because there isn't anything like POSIX for mobile devices even though both Android and iOS are supposedly based on Unix via Linux and OSX, respectively.
Twitter is the perfect example of the problematic nature of websites as heavy JavaScript applications. No amount of dynamic content will justify the performance penalty and extra waiting for a single poorly thought out sentence. I just clicked a link to a tweet on a news story, and it took 15 seconds to load.
Everything else you can do with JavaScript can be done server-side, or in native apps.
Without JS, you would have to reload the page any time you wanted to perform the simplest action, e.g. clicking the "Vote" link on a comment here. Yes, you could still perform that action, but that's a ton of unnecessary overhead.
I don't want to install dozens of shitty apps just to enable such a simple piece of functionality as "don't refresh the entire damn page when all I want to do is click 'Vote' and move on". I have a native app that's cross-platform and supports content from just about any source you could think of with built-in support for that kind of functionality: it's called a web browser with JS enabled.
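One way to square that with the no-JS crowd is progressive enhancement: keep the plain form so voting still works without JS, and intercept it when JS is available. A sketch; the form markup and endpoint are assumptions:

```javascript
// Enhance <form class="vote" method="post" action="/vote"> forms so a vote
// doesn't trigger a full page reload. Without JS, the form still submits
// normally.
document.querySelectorAll('form.vote').forEach((form) => {
  form.addEventListener('submit', (event) => {
    event.preventDefault(); // skip the full-page POST + reload
    fetch(form.action, { method: 'POST', body: new FormData(form) })
      .then((res) => {
        if (res.ok) form.classList.add('voted'); // update UI in place
      })
      .catch(() => form.submit()); // network hiccup: fall back to a reload
  });
});
```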
I do despise the current state of JS, though. I shouldn't have to get started with a bundler, transpiler, polyfills, and five different fucking package managers just to install the one goddamn library I need. This whole JS-everywhere mentality and over-reliance on bleeding-edge tech is just one giant catastrophe waiting to happen.
I agree that js isn't needed for everything, but native apps suck.
Instead of HTML being generated on, and delivered from, the server, a JS bundle is sent to the client, which is then decompressed and initialised and then requests data, which is then sent from the server (or another server, as now everything is a service) as JSON, where it is then converted on the fly into HTML.
Well, yeah. JSON is a fairly standard data format for transporting data now. If you want the ability to efficiently reload data, or if you want a separate mobile application, or if you want to standardize your generated HTML to limit inconsistencies, then naturally you're going to want to move to a pure JSON setup.
Yet consider this: we know that 53% of users leave a site if the time to user interaction is greater than 3 seconds. . . Yet we just don’t care about page load times. Does the average site load within 3 seconds? Does yours? We don’t test this regularly. It is not part of our standard metrics in our industry, despite it being so tightly correlated with user satisfaction.
The problem isn't JS, it's bloated JS. Pre-rendered HTML content can cause initial page load times to seem significantly higher than an empty page with some lightweight JS ready to make an Ajax request or two. By making the page seem more responsive on initial load, a user's patience isn't as quick to wear out, which gives you a bit more time to get that data loaded and rendered from the JSON endpoint(s). You can also put up a loading animation of some kind to alleviate the user's irrational impatience and make it clear to them that things are actually loading.
In other words, there are a lot of different incentives to move toward JSON + client-side rendering of HTML, and user impatience is one of them. It's often the case that JSON + client-side rendering is selected specifically as a solution to solve the user impatience problem.
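A bare-bones sketch of that shell-plus-JSON pattern; the /api/items endpoint and the element id are made up:

```javascript
// The shell ships with an empty <ul id="items">; show a loading state,
// then render the JSON once it arrives.
const list = document.getElementById('items');
list.textContent = 'Loading…'; // the user sees progress, not a blank page

fetch('/api/items')
  .then((res) => res.json())
  .then((items) => {
    list.textContent = '';
    for (const item of items) {
      const li = document.createElement('li');
      li.textContent = item.title; // textContent avoids HTML injection
      list.appendChild(li);
    }
  })
  .catch(() => { list.textContent = 'Failed to load.'; });
```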
For reference, the JS libraries I use unminified take up roughly 160KB. Minified, they take up just over 62KB total. Any JS I write on top of that is pretty much negligible. After initial load, those libraries are cached and every single request thereafter benefits from reduced bandwidth usage.
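For what it's worth, that caching behaviour usually comes down to long-lived Cache-Control headers on the static bundles. An Express-flavoured sketch; the paths are hypothetical:

```javascript
const express = require('express');
const app = express();

// Fingerprinted bundles (e.g. libs.abc123.min.js) can be cached for a year,
// so repeat visits skip the ~62KB download entirely.
app.use('/static', express.static('public', {
  maxAge: '365d',
  immutable: true, // adds Cache-Control: immutable
}));

app.listen(3000);
```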
But, again, the problem is bloated JS. The reddit redesign is a good example of JS done very, very wrong. On my connection, the code I've written for the web portal I recently built has a total load time of around 2 seconds and a total transfer of <180KB for the unminified version, and <80KB for the minified version. Compare that to an old topic of mine showing the reddit redesign's network load.
In short, I agree that the state of modern JS is terrible, but let's not equate "modern JS" with "all JS", because some JS is good and, quite frankly, necessary for a functional web experience. I don't know about you, but personally I like being able to click "Vote" and not have the entire page reload just to submit that tiny request.
I think the issue isn't so much the raw filesize but the time it takes to execute the JS for a webapp. My website is 163KB, but with 400KB/s bandwidth + 400ms latency and a 6x CPU slowdown to simulate mobile, it still takes about 3000ms to render. Without the network throttle it renders in about 1500ms, suggesting that at a minimum it's never going to render under that on a weak computer. If I had made the site server-side instead of a webapp, it would almost certainly be <100ms.
Maybe I'm wrong though. I'm pretty new to web development and it's only recently that performance came to my attention as a critical issue.
You're right that not all assets are created equally, and Javascript will be slower than CSS or images due to processing time. That said, it's probably the fastest interpreted language today. Modern browsers are JIT'd out the wazoo, and I've never found processing time to be a significant consideration versus transit time.
If you're seeing very large numbers, you may want to play with Chrome's Performance tab in dev tools. It offers a very comprehensive breakdown of exactly what your browser is doing in that time (eg. CSS repaints, layout calculation, JS events firing). If there's a slow function, this may help reveal it.
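The User Timing API pairs nicely with that: marks and measures show up right in the DevTools timeline, which helps pin a slow span down to a specific function. A small sketch with arbitrary names:

```javascript
performance.mark('render-start');

// ...the code you suspect is slow goes here; stand-in workload for the demo:
for (let i = 0, s = 0; i < 1e6; i++) s += i;

performance.mark('render-end');
performance.measure('render', 'render-start', 'render-end');

const [entry] = performance.getEntriesByName('render');
console.log(`render took ${entry.duration.toFixed(1)}ms`);
```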
Here's a detailed (and long) article that goes into this topic in quite a bit of depth: https://medium.com/@addyosmani/the-cost-of-javascript-in-2018-7d8950fbb5d4
Well said, and I agree with you completely. It's entirely possible to write fast and efficient Javascript. This website is a perfect example of that fact.
It's of course worth noting that (most) developers are at the mercy of a client or someone higher up. I'm sure there are plenty of developers that would love to bring down page load times and website sizes in general by removing scripts, pop up windows, fancy scrolling, video backgrounds, etc. But clients today expect and request them. Why? Because their competitors have them.
You can try to persuade them away from the shit we find super annoying, but if you actually succeed in doing so and their marketing doesn't do as well as expected, they'll likely blame you for not adding in all that annoying shit from the beginning. Developers aren't purposefully adding in this crap - someone higher up with little to no knowledge of how a website is built is pulling the strings. And when they have a marketing department showing how well those little pop-ups bring down bounce rates and increase site interactions, it doesn't help your case.
And of course, even measuring the effect of the marketing effort requires adding in more scripts, allowing 3rd parties access to personal data, and adding in even more bloat. In the end, the bloat does absolutely nothing for the person actually interested in a site's content.
At the end of the day, this is a lot of logorrhoea stating the obvious. Also:
Despite what Tim Berners-Lee claims, the Web wasn’t a single invention. Yes, we know that one man says that he invented it (grudging thanks to Tim), but he did so on the back of a thousand other technologies, inspired and enabled to build by their previous work.
Yes, you say in a footnote that this is a joke, but this is not a joke. If there's one original invention at the beginning of the Holocene, any invention after that one was not a "single" invention. Everybody stands on the shoulders of giants, making an inverse pyramid. But there is more to this website. The tone of the first few paragraphs is rather off-putting, and the navigation on the left has a link "about her". Why do I need to know your gender? In the first lines of the text the author refers to themselves as an "old lady", but when I go to the "about her" page, they say they're 30-something. Since when has that become old? And why, in a talk titled "Dear Developers, The Web Isn't About You", does uBlock Origin tell me that it's blocking 178 requests?
What happened to being serious and smart, and why has talking to an audience degraded into condescending, awkward "humour" sprinkled with fact-ish thingies in this industry? I'm glad I left it before it was too late.