What's the big deal with Electron?
I thought about asking this over here but didn't want to potentially derail the thread.
As a lay user, I honestly don't know what Electron is and couldn't tell you whether or not a particular app/program uses it. However, hanging out in techy spaces has helped me understand that people have some very strong opinions on it, often viscerally negative ones.
Think of this like an r/ELI5 thread: what is Electron, and why is it so polarizing? Many companies seem to be choosing it, so it seems like there's some major benefit to it, but many well-informed people, including lots of people in software development, seem to absolutely hate it. What's going on there?
The reason why Electron rubs some software devs and especially end-users the wrong way is essentially its inelegance. The ELI5 analogy I’ve seen in the past is this. Think of your OS as your living room. Think of the objects that you put on your coffee table as the apps that you run on top of your OS. Think of the coffee table itself as the set of frameworks afforded by your OS that your apps run on top of.
Under this analogy Electron apps force you to run each one on a separate coffee table. If you just run one Electron app, that’s not really that cumbersome. But, if you have half a dozen apps that are all based on Electron, that’s like having half a dozen coffee tables in your living room—one for your TV remote, one for your lamp, one for your book, one for your keys, one for your sunglasses, etc. And, unless you have a really big living room (analogous to having a lot of RAM), you may run out of space to put all these single-purpose coffee tables.
Instead of letting your apps live on top of the native frameworks provided by the OS, Electron apps come with an entire browser (Chromium) that is bundled independently of all the other apps you install. This inelegance of not sharing resources is the main reason I prefer to use native apps over Electron apps.
"Techy spaces" often self-select for strong and usually negative opinions. Electron (and Firefox, ironically enough) frequently receive unjust hate.
Electron is a framework that lets you ship web applications as desktop applications. Companies love this because you only have to write one application, and it'll run on all (non-mobile) platforms - macOS, Windows, Linux, Chromebooks, you name it.
(It is slightly more complicated than that. But not very.)
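To make that concrete, the entire "desktop" part of an Electron app is often little more than the following sketch of a main-process file (hypothetical filenames; this assumes the `electron` package is installed, so it won't run in a plain Node environment):

```javascript
// main.js - a minimal Electron entry point (illustrative sketch)
const { app, BrowserWindow } = require('electron');

app.whenReady().then(() => {
  // Open a native window whose contents are just a web page
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile('index.html'); // the "app" itself is HTML/CSS/JS
});

// Quit when all windows are closed
app.on('window-all-closed', () => app.quit());
```

Everything else - layout, logic, styling - is ordinary web code, which is exactly why one codebase covers every desktop OS.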
Many independent developers also love it because to be honest, HTML+CSS is far and away the best / most enjoyable method of creating graphical user interfaces. System libraries (especially on Linux, although I haven't tried Qt) just don't stack up.
This doesn't come without drawbacks, of course. The most noticeable drawback is that Electron applications don't look right. Every Electron application uses different spacing, different fonts, different sets of icons. While your mail app and your file manager might share the same general placement of action items and stylistic choices, Spotify, for example, throws this design language and integration out the window. This is personally my biggest gripe with Electron - I like my computer looking nice, and disregarding the native graphical toolkit is an easy way to ensure inconsistency.
It's particularly noticeable when you're using multiple Electron programs. My development environment is Electron (Visual Studio Code). My music player is Electron (Spotify). My chat client is Electron (both Discord and Element). My documentation is essentially Electron (an HTML page, opened in Chromium). I do have and use native alternatives to these sometimes (Gedit, Celluloid, Hexchat, books / neovim, spt, irssi, man), but it's unfortunate that almost all the developer effort is focused on the Electron behemoths.
Security concerns around Electron (each app ships a whole browser, which can lag behind Chromium's security patches) are the third reason some people don't like it. Those are mostly FUD at this point.
Electron is a tool which allows you to bundle a portable copy of the Chromium web browser with a single web application, as well as letting you interact with the native system (e.g. modify files) in a manner which normal web applications aren't able to.
It's popular because the web is popular; and because there's a perception that graphical web applications are in some respects easier to make than graphical applications using traditional architectures.
It's unpopular because it tends to use a lot of memory; because there's a perception that non-'native' apps are inferior; and because there's a perception that the web as a whole is a failed platform.
As a web developer I'm curious - do you know why some people think that?
I think a lot of the thoughts of the web as a failed platform come from social problems (advertising, tracking, social media) which aren't relevant here, but off the top of my head:
(disclaimer: I am not in that camp)
In many regards, it's a correct one. While I personally hate web stacks, they have by far the most mature, expressive tools for UI/UX development. And they have tackled problems that many other platforms still stumble over to this day, like responsive design and multiple modes of interaction (mouse vs. touch screen vs. mobile device). And of course, web tech is some of the most portable tech as well (portability is always a pain worth getting around, despite the compromises).
Granted, throwing an entire web browser into your application for this is overkill 90% of the time (and the most resource-hungry browser at that), but the idea of adopting the lessons the web has learned into other kinds of apps makes sense. I would hope that in time a competitor (or Electron itself) can strip this down to the bare essential rendering engine required for processing HTML/CSS/JS, but the jury's still out.
Of course, the downside for power users is less readable source, but that's the case for most platforms that compile their applications. The obfuscation definitely has corporate value, but ultimately it is done for performance reasons.
I'm pretty sure most people would agree that web technologies are fairly successful on the web.
This is becoming less and less true and I greatly look forward to the day professional software starts running on the web.
It's funny - we started at dumb terminals connected to mainframes, and we're headed back to very smart terminals (but still mere terminals) connected to remote servers.
Electron is a desktop GUI framework that embeds a copy of Chromium. You write your GUI as a single-page web application, and the embedded Chromium renders it and manages events as if you were on a website, but with OS-level APIs exposed that websites wouldn't have access to. It's quite popular for a handful of reasons:
Part of a holy grail of codebase commonality - if you use React for your webapp, then you get iOS/Android for free with React Native and Linux/Windows/macOS for free with Electron.
Cross-platform desktop GUI space is a mess - GTK has the good ol' unhygienic APIs C programmers made back in the day and doesn't look particularly native on half the platforms regardless, while Qt has licensing FUD plus far fewer bindings from being in C++
Flexibility - it's quite easy to whip up a custom "widget" in Electron, or make use of the millions available for each of the frameworks on npm. On the other hand, trying to get custom behavior out of GTK or Qt widgets feels like pulling teeth
Of course, the cons are that there's a high resource floor since, no matter how simple your app is, you still need to bundle and run Chromium with it. It will also likely not feel or look particularly native (although that's something you can fix with some elbow grease: see https://getlotus.app/21-making-electron-apps-feel-native-on-mac as an example)
Others have laid it out pretty well. I'll just add that it also increases the hegemony of Google and their web technologies. The more developers get used to only writing for Chromium, the harder it will be to find developers who can make web sites work well with Firefox, Safari, Vivaldi, etc. and we end up ceding what should be open standards to a single company.
I agree. Chrome's ~2/3 global market share and the way they can effectively force other browsers to adopt what they adopt for fear of being made "inferior" and left behind is a problem, and it's a really tough one to solve. I read an article recently talking about how the push for ever-more-and-newer web APIs is at least partly driven by a sort of power struggle between app developers and web developers at Google, with the web developers pushing for the browser to expand wider and wider so that web-apps can compete with apps on functionality.
Also, if I may be pedantic for a moment, Vivaldi is built on top of Chromium so Vivaldi users should be fine there unless Vivaldi is adding some sauce on top of the engine that Electron-accustomed developers won't be accounting for (which is entirely possible).
People who believe this probably don't remember what web development used to be like. It used to be very likely that you would have to come up with nasty workarounds for IE6, for example, sometimes requiring your whole app to be restructured. jQuery was popular because basic DOM manipulation APIs varied so much.
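As a sketch of the kind of branching pre-jQuery code had to do by hand (IE8 and earlier spelled event binding `attachEvent` instead of the standard `addEventListener`; the "fake" element objects below are stand-ins so this runs outside a browser):

```javascript
// Cross-browser event binding, the way libraries did it circa 2008
function on(el, type, handler) {
  if (el.addEventListener) {
    el.addEventListener(type, handler, false); // standards-compliant browsers
  } else if (el.attachEvent) {
    el.attachEvent('on' + type, handler);      // legacy IE spelling
  }
}

// Stand-ins for DOM elements, exposing one interface each:
const calls = [];
const fakeModern = { addEventListener: (type) => calls.push('modern:' + type) };
const fakeLegacy = { attachEvent: (type) => calls.push('legacy:' + type) };

on(fakeModern, 'click', () => {});
on(fakeLegacy, 'click', () => {});
console.log(calls); // → [ 'modern:click', 'legacy:onclick' ]
```

Multiply that by every DOM API - queries, styles, XHR - and you get why a compatibility layer like jQuery took over.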
These days, Safari is probably the furthest behind, and it's still a very capable browser. The new APIs I see added in Chrome release notes are increasingly obscure, and most web developers can safely ignore them.
Maybe developers are slacking off on cross-browser testing in part because much of the time, it just works?
There was a time when web standards were publicly worked on and there could be many implementations of them, but that time has long passed. I was incensed when someone else posted this link because it's basically a list of "standards" that are not only extremely arbitrary but will most likely only ever work in Chrome.