An opinion on current technological trends
For a while now I have been personally completely dissatisfied with the direction (mainstream) technology is taking.
Almost universally the theme is simplification on the end-user-facing side. That by itself would not be so bad, but products that go this route now almost universally also take control away from the user, including things I would not have believed would be accepted just a decade or so ago: forced telemetry (i.e. spying on user habits), forced updates (i.e. changing functionality without the user's consent), loss of information through simplification of error messages to the point of uselessness, loss of customization options or their removal to places impossible to find unless you already know about them, and nagware, bloatware and ads forcefully included in the base OS install. And that is simply the desktop/laptop environment. The mobile one is truly insane, and anything else "smart" is simply closed software and hardware with no regard for user agency at all.
Personally I consider the current iteration of the "just works" approach flawed; problems will inevitably arise. Withholding basic information and tools simply means that the end user does not know what happened and is dependent on support for trivialities. I also consider the various "hmmm", "oops" and similar error messages degrading, and I think they help cultivate a culture of technological helplessness.
To be honest, I believe the option most people generally end up taking, disinterest in even the superficial basics of technology, is an objectively bad one. Computing is one of the most complex and advanced technologies we have, but the user-facing side, even in systems such as Linux or Windows 7 and older, is simple to understand and use effectively with minimal effort. I do not believe most people are incapable of acquiring enough proficiency to, for example, install an OS, take a reasonable guess at what a sane error message means, or even understand the basics of using a terminal; they simply choose not to bother. But we live and will continue to live in a technological world, and some universal technological literacy is needed to prevent the loss of options and the loss of agency of the end user. The changes introduced in mainstream software are on a very clear trajectory that will not change by itself.
I have this vision of a future where the end user interacts solely with curated LLM systems without the least understanding of what is happening, why it is happening or who makes it happen. The blackbox nature of such systems then introduces subtle biases that were either not caught by brute-force patches over the systems or simply not caught at all, perpetuating who knows what. Unfortunately, given current trends, I do not think this is sufficiently unlikely.
Up to a point I understand not wanting to deal with technology problems, but what gets introduced instead are roadblocks that are just as annoying to get through, with the difference that they will not stay fixed. Technology directs a massive portion of our lives; choosing not to make an effort to understand even the absolute surface of it is, I think, not a sound decision, and it creates a culture where it is possible to introduce disempowering changes en masse.
So far this has honestly been a rant, and perhaps I just needed to vent, but I am genuinely interested in the thoughts of the community on this broad topic.
I actually see your complaint happening every day in real time. The students I teach use laptops all day in school. Tech hiccups are rarer than you'd think with the sheer number of computers in use but they happen. And when they do, these kids take exactly zero seconds out of their day to attempt any troubleshooting. If I'm lucky, they will raise their hands immediately to ask me to fix it. Some of them don't even advocate for themselves and just sit there until I notice. To be fair, those kids are often using a bit of cunning and have a vested interest in not fixing the issue. They want to claim they couldn't do the work because the computer wasn't working.
I'll tell you the most common tech issue I have - almost daily in fact. We have protection and monitoring software on our computers and they won't connect to the Internet if it fails to initialize for any reason. Every single day I have at least one student raise their hand to tell me the internet isn't working. I walk over and see a giant popup that says "[Monitoring software] failed to launch. Click here to launch now and connect to the internet."
It has a giant blue button saying launch now. It tells them exactly what the issue is. And it pops up over their browser. It makes me crazy. I really think they're just used to their iPhones working all the time.
Now, a counterpoint to this would be that most people don't know how most things work. That's always been the natural progression of increasingly complex technologies. I don't do engine teardowns when my car stops working. I usually take it to the mechanic unless it's something very small and easy. But I have to imagine there was a time when people who owned cars had to know a lot more about how they worked. I barely understand how cars operate to be honest. I have a very general understanding of the principles and that's it.
I guess what I'm saying is that we all have limited hours in the day and unless your job/hobby is directly tech related, you don't need to have too much deep knowledge about it. Probably better to specialize in what you do need to know. My tax lady can just call the IT guy and focus on QuickBooks or whatever she uses.
I am very, very tech literate but I refuse to diagnose problems and troubleshoot on devices that I do not own and am not in control of.
When my work laptop starts getting flaky, I am more than happy to do a reboot. Above and beyond that, I send a ticket in to our device team.
See, in a corporate setting, most would generally advise not hitting big buttons that pop up over an internet browser.
I think that you might be fighting against a bit of malicious compliance, a bit of "not my problem", and a bit of typical school kid pranks rather than an inability to solve the problem at hand.
If kids are not allowed to own their devices, and not allowed to manage their devices, and everything is locked down, I don't know how or why they would be expected to have any problem solving skills.
I would totally understand not hitting the big blue button once, but students have had these computers for four years by the time they get to me - most have had this happen once or twice and should know the drill. They genuinely just don't even try.
I have a pretty good sense of the kids who are being maliciously compliant and the ones who honestly just don't know what to do. The well-mannered kids who never miss a homework assignment are the ones who are being sincere. The student who ate a donut he found in the bathroom a couple weeks ago though, yeah he's doing it on purpose. Actually, he has tried to pull a few fast ones on me by intentionally finding ways to make his laptop inoperable.
If he can find a not-so-obvious way to disable WiFi or cause a browser extension to misbehave, he will!
Not coincidentally, he is one of my favorite students because he's hilarious and his behavior is terrible while never being malicious.
It’s like a pencil, right? If your pencil breaks & you just… sit there for ten minutes, well that one’s on you my guy lol. Doesn’t really matter if it’s a school-provided pencil, or you can’t install games on your pencil—still gotta have it, preferably in operating condition!
That's my take. They want me to work on this stupid heavily restricted machine they've bogged down with obnoxious tracking and antivirus software? Fine, they can fix it whenever something inevitably doesn't work right.
At least my car still gives basic information on like "low tire pressure" or "low fuel" and not just "Oops, something went wrong" as many apps do these days. Just recently when Meta was down, Facebook Messenger just failed on the login with a nonsensical error message. Which led me, and many others, into thinking we had to reset our password. Would have been immensely helpful if the app could at least say something like timeout error or even 503 status code or whatever. I agree that we all live with a good deal of "blackbox" technology in our lives and no one can be an expert in everything, but at least give a proper error message.
But with your student example, that seems to be an uphill battle. Which reminds me of another pet peeve of mine: how often politicians talk about the younger generations as "digital natives", as if they were somehow more technically inclined than the older generations. Nothing could be further from the truth. They can swipe around their apps super fast, but their level of technological comprehension is not higher. I have heard from people working in IT security that, aside from the very elderly, the younger generations are the age group that most often falls victim to various online scams and phishing.
I'm also passionate about computers, and I think it's a bit sad others, like your students, aren't, but my reaction to your anecdote is completely different from yours:
This is just bad design by the monitoring software company. When my car's tire pressure is low, the car can't just inflate the tires itself. However, it would be trivial for the monitoring software to restart itself automatically. It shouldn't be bothering the user, except maybe to show a notification that it is retrying in the background.
I understand not having a deep knowledge about computing technology. I work in IT and I do not have what I would call deep understanding outside my narrow field of work; computing technology is immensely complex and complicated.
I am simply advocating for a shallow understanding: being able to diagnose simple issues, being able to orient yourself in a well-designed unknown program, being able and willing to launch and adjust settings, being able and willing to follow troubleshooting advice, and having basic knowledge of the OS you use, to give some examples.
I do not think this is unreasonable, and I think it would lead to a generally better society by making adversarial changes harder to push through.
To continue the car metaphor, it's also reasonable (in the days of ICE) to expect people to understand the basics of an engine. It would be reasonable to expect people to understand the basics of an EV.
Every single bit of knowledge needed to understand at a high level how an ICE car works was taught to you in middle or high school if you were paying attention. Ditto for an EV.
I wouldn't expect somebody who owns an ICE car to want to change their own spark plugs. But I'd expect anybody qualified to drive to understand what they are for.
I (mid-twenties), know that spark plugs exist, but absolutely do not know what their purpose is (without having to look it up).
If my car was having trouble, I don’t think I’d be able to tell if it was a spark plug issue.
Now, when my car battery died, I was pretty sure it was the car battery. And when I started my car to an extremely loud roar, I was pretty sure the catalytic converter had been stolen (and it was).
They do the actual ignition of the fuel in the engine. No spark plugs, no combustion. Their timing is critical to the functioning of the engine, hence why it's important that they (and the wires that connect them to the rest of the electronics) are in good condition.
But the thing is....you knowing that other stuff easily puts you in the top 10% (arbitrary guess) of car owners. Like knowing what the registry is in Windows. I frankly can't remember what a catalytic converter is off the top of my head either.
And my point isn't that everyone should be able to remember all this stuff all the time. But that they should be able to have enough of a conceptual model that they can figure out how it 'fits in' with a few google searches. Pretty sure your reaction was "oh duh" and not a blank stare when you read that first sentence, as will I when I go search 'catalytic converter'. For computers though there's nowhere near that kind of conceptual model for the general public. And if computers (and integrated circuits) were not so utterly integral to damn near anything more complex than a screwdriver, it would be more acceptable.
But we have literal supercomputers in our pockets (an iPhone X could do about 400 GFLOPS, the fastest supercomputer circa 1995 could only do 170). Our USB-C bricks have more powerful computers than the Apollo missions, and the general public is at a loss if one of their buttons on the screen of their supercomputer is in a different place than it used to be.
That's not a reasonable thing to expect of people, I think.
It doesn't really make me a better driver to know how the car works [1]. It doesn't make me a better writer to know how my word processor works. My headache doesn't go away faster because I know ibuprofen inhibits my cyclooxygenase enzymes.
Using a tool effectively is not the same thing as understanding how it works. It can be, but it isn't necessarily true. I don't think it's very reasonable to expect everyone to have the time/energy/interest to delve into figuring that stuff out when they mostly don't actually need to.
If, as you say, you don't expect car users to fix their own cars - which is a perfectly reasonable expectation - why would you expect the same from computer users?
[1] This is where the analogy breaks down somewhat because you can make a case when it comes to cars that it gives me a little more control over the vehicle if I have once stripped down a clutch or if I have some handle on the ionic mobility of lithium - but in a modern car it matters far, far less than it used to (same same for computers).
No, but knowing that ibuprofen is an anti-inflammatory and acetaminophen is not helps you make an informed choice about which to take, no? The equivalent to computers right now is that somebody calls a doctor if they've got a headache.
I'm not saying "you need to know what every part of a car does in excruciating detail down to the exact specifications." I'm saying "you should have enough of a rough understanding of how your car works so if something stops working you don't get ripped off paying $500 for someone to change the air filter that doesn't fix the problem".
Saying "Your GPU is overheating" shouldn't be met with blank stares. People don't need to know how to make a GPU or exactly how it works, but they should have a rough understanding of what a GPU is, and why overheating is a problem.
And as I mentioned in my cousin reply, part of it is that computers are ubiquitous in a way that cars are not. The difference between your Nintendo switch and your cellphone is mostly just a matter of software. The difference between your PC, PS5, and macbook pro 2016 is entirely software (and iterative improvements). The only reason you can't legally have a unified interface between Amazon Prime, Disney+, and Netflix is because of laws preventing interoperability. Do most people realize this? That's the level of understanding I want people to have. "Oh the only real reason I can't have iMessage on Android is because Apple won't let people run it on Android."
Computer literacy is really that bad. Tablets are not considered computers with integrated screens... they're iPads (even when they're Android). People don't grok that the media center in their car is more or less the same stuff that's running their phones. And that's the level of understanding I want people to have: "Oh, the only real reason I can't have iMessage on Android is because Apple won't let people run it on Android."
Absence of knowledge allows grifters to thrive. If most people understood that an NFT was just a receipt for a URL, I doubt the hype would have been able to lift off the way it did.
Even my late-70s parents know to turn it off and on again. That's been a meme for decades now. They aren't calling the doctor for a headache. Knowing to reboot is a decent amount of knowledge to have because it fixes a lot of stuff. Especially with modern systems which are just so much more stable than in the bad old days. I don't get calls from my friends and relatives any more asking to fix the computer, because the computer has been fixed already.
Computers became ubiquitous by being able to fix themselves (also by getting cheaper and more useful, but still) which meant they required less knowledge to use.
See, I don't think that's computer literacy. I don't need to have ever even used a computer to understand NFTs/crypto/etc are scams if I have good skills at interpreting information. The work my wife does with scam-prevention groups doesn't have any focus on computing devices; it's entirely about understanding information.
Information literacy is incredibly important. Computer literacy, meh, not so much.
To go back to the car thing, computer literacy lets you fix a broken wiper motor; information literacy lets you avoid crashing into other drivers. I'd much rather the majority of drivers put their available skill points into the latter than the former!
I'll put it this way, especially in light of trying to reduce populations. The numbers are made up, it's the concept that matters.
If only 1 in 1,000 people will bother to learn computer literacy, what happens if we reduce the population by 10%? Say we go from 10 billion people to 9 billion for the sake of easy math. We go from 10 million people who can potentially be computer literate to 9 million. If we need 9.5 million computer-literate people to maintain computer systems, we've hit a massive problem and need to onboard 500,000 people in a hurry.
If instead we've conditioned people so that 1 in 100 "can" learn computer literacy, we'll never run into that problem. And because computers are so utterly integral to everything everybody does now... it's probably for the best that we teach computer literacy the same way we teach general reading and math.
Heck, I think we hit a major regression when most high schools ditched wood shops and metal shops. It's fostered this level of helplessness for things that are not terribly difficult, like repairing a wood chair.
Um. That's a very unexpected context to put things in. If a billion people disappeared overnight I think we'd have far more issues than who is going to update the world's printer drivers. I'm not entirely sure I understand how that's relevant.
On that basis all vocational skills should be taught. Which is obviously ridiculous because people would spend their whole lives in school - there's too many things to learn. The whole point of school is teaching the basics so people can specialise later.
Every job/hobby needs maths and English to some degree, but only people who want to maintain computers for a living need to know how computers work. Heck - I've worked with programmers who have no idea what a filesystem is, and they were perfectly good at their jobs.
Why is there a button at all? If the fix takes a single button press, that probably means it should take zero button presses. Restarting the app is always what the user would want to do; I don't think there can be a case where a person thinks "oh, I'm glad this software crashed, this is actually exactly what I wanted, so I won't restart it". Why doesn't the software just restart automatically if that's the obvious thing to do?
Yeah, I mean you're absolutely correct. But still, it's not like it's a big ask for kids to just hit the button once in a blue moon.
They do other stuff too, like not thinking to restart their computer if something is wrong, or not thinking to check other folders if they can't find an assignment they know they did. 9 times out of 10 I find the file under Downloads or Documents when they thought they saved it to OneDrive.
If I had to guess about the launch button, I would assume it's a workaround for some limitation imposed by Microsoft or my school's network settings. Perhaps they don't want apps automatically relaunching multiple times for one reason or another, though I can't begin to imagine the specifics behind it.
It's probably a bad idea to allow an infinite loop. It's probably not 100% certain that clicking the button would fix it.
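For what it's worth, here's a rough sketch of that trade-off (purely hypothetical Python: the function names, the 3-attempt cap and the backoff are all my own invention, not anything the real monitoring software does). The idea is "retry silently a few times, then fall back to the button":

```python
import time

MAX_ATTEMPTS = 3     # arbitrary cap so a broken install can't relaunch forever
BACKOFF_SECONDS = 5  # pause between silent retries


def launch_monitoring_agent() -> bool:
    """Hypothetical stand-in for relaunching the monitoring software.
    Returns True if it started and connected, False otherwise."""
    return False  # placeholder; the real check would be vendor-specific


def show_launch_now_button() -> None:
    """Hypothetical stand-in for the 'Launch now' popup the students see."""
    print("[Monitoring software] failed to launch. Click here to launch now.")


def ensure_agent_running() -> None:
    for attempt in range(1, MAX_ATTEMPTS + 1):
        if launch_monitoring_agent():
            return  # came up silently; never bother the user
        time.sleep(BACKOFF_SECONDS * attempt)  # simple linear backoff
    # After a few failed silent retries, fall back to the manual button --
    # at that point another automatic relaunch probably won't fix it anyway.
    show_launch_now_button()


if __name__ == "__main__":
    ensure_agent_running()
```

A bounded retry like this avoids the infinite-loop problem while still only bothering the user as a last resort; whether the vendor could actually do this within the school's lockdown settings is another question.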
My complaint is relatively minor, but it does somewhat mesh with what you're saying, which is "cutesy" messaging. Stuff like,
"Hold tight, getting stuff ready!"
"Whoops! Something went wrong!"
Etc, etc
I know average users don't care about what's happening, but this stupid, "relatable" corpo speak drives me up the wall.
Discord and their fucking Wumpus.
I truly hate the "it's your lucky day" message I get every time I try to open Discord and find out there's a new necessary update that I will need to manually download and install.
I get the need to update regularly, and that's not in and of itself a problem, but the "it's your lucky day" message feels so snide and unwelcome. It is not my lucky day; my day is not somehow magically improved by you releasing a new version, especially when you only send me that message to let me know that I need to do work and wait around for the thing to be installed in the first place.
It's funny the relationship we have with technology. In contrast, I see many people in the FOSS world cheer the availability of updates as free new features.
Yeah, but in the FOSS world you usually get to choose which updates to install, and when to install them.
To be fair, most stuff we complain about are live services, we can't really expect them to let us pick and choose updates. Although I would love a, say Discord LTS, or Teams LTS, that only gets security updates.
But that doesn't generate money, so it'll never happen.
Yeah, every live service is going to force version matching to some extent between client and server. As an enterprise customer, LTS teams and other enterprise platforms would be great.
I suppose what's funny to me is having any software that has aspects you truly hate but continue to use, outside of things needed for work or work like activities.
I also dislike Discord, which is why I've run my own TeamSpeak server since forever. I realize there is a lot going on on Discord as a de facto community hub, but if folks "truly hate" it, it's not like alternatives don't exist.
I commented somewhere else, but I don't hate Discord overall, I find it very useful and it's where my friends are, so I use it. I just hate that one message, and the UX decisions behind it.
LTS without any extra enterprise pricing would generate negative money. An extra stable release channel would be a huge hassle.
It's not the updates themselves — I agree that those are cool. It's that when I get an update (ironically, I just got one now), I'm usually opening Discord to chat with a friend, and instead I have to open my browser, download the update, wait for the download to finish, apply the update, wait for the update to be applied, restart Discord, wait for Discord to do its own updating mechanism, etc.
In other words, when I get the "it's your lucky day" message, it is not my lucky day — I am being prevented from doing the thing I want to do right now. And that is completely incongruent with the message, which is cheery and positive, and telling me I should be happy, even as it's presenting an obstruction to my path.
To be clear, I completely understand the technical side of this. I am not generally opposed to them forcing me to apply an update every so often before I can start Discord. (It would be nice if they let me apply updates in my own time, but I understand that this makes things technically more complex, particularly for an application like Discord that is fundamentally connected to other servers and users.) If the message was purely informative ("you need to install a new update to continue", or even "we're sorry, but you need to install this update first") I would still be put out by having to do the update, but understanding of the situation.
But instead, a chore that I have to perform to do the thing I want to do is presented as a positive thing, and I find that UX decision particularly irritating and abrasive.
Also instead of being able to say no, it's always "not right now" or "remind me later". No, the answer is definitely no and don't ask again!
I think you've identified the crux of the issue by calling out that it's mainstream technology driving this trend. Opaque, user-agnostic technology black boxes are not an accident, they are the adaptation technology has made to go mainstream in the first place. Having any technological barrier to entry, even a minor one, places drastic limits on the spread and scope of any development.
It's probably fair to say that most people would be better off taking an active interest in their tech rather than just passively consuming it, but I could say much the same about pretty much any aspect of their lives, from their nutrition, transportation and education right on up to their local politics. The hard truth of the matter is that you are running up against an entrenched cultural disinterest in the fine details of tech, and any change to that would have to happen in the cultural sphere first.
A demand for technology that is open, transparent and configurable requires an interest in such technology, and most people simply aren't interested. When tech was niche and immature, it was marketed primarily to those people who were most interested. You're noticing the divide between products marketed to the two different groups.
I cannot disagree with the main point, but I do argue that more people should take an interest, because the trajectory we are on right now, with a market-insignificant minority having any interest, is what it is.
Personally, I would have considered a limit on the rate of spread of, for example, smartphones a decade and a half ago a blessing; they are absolutely amazing technology used in appalling ways, having bad effects on too large a segment of their users and pushing standards that are downright terrible for end users.
I'm not disagreeing, but the motivations of the companies in question pushing smartphone tech to the masses did not really take anything into account except how to maximize market share, since whoever captures the market share early on in the segment's creation will have massive structural advantages going forward due to lock-in. Therefore - smartphones were as universally usable as they could be. Companies that tried to target individual segments of power users or more tech-savvy sorts either died out, were acquired or became niche boutique players. Money is the driver, and the money is in the masses.
I have agreed with that position for a long time but I don’t entirely think it’s correct anymore.
You may have at one point heard of the term commodity computing. The idea is basically to make computers ubiquitous. What people in tech have been doing to make this a reality, for the most part, was to either open up the specs or to make computers very cheap. They did that; computers are dirt cheap these days, and everyone owns one in one form or another. But if we look at the history of the subject, they were fundamentally misguided. Price was important too, yes, but far more important was utility.
A can opener is technology, too. Imagine for a moment they were as difficult to use as computers were in the 1980s, where you could easily mess up the can-opening operation and you needed special training to even understand how to use them. Do you think canned food would have become ubiquitous if that was how things worked? No; people buy things because of utility, and utility is diminished or erased by being difficult to use. Computers were much the same.
If I am a painter, what is the reason why I would want to put away my paints and start working digitally? There are many benefits to working that way, but there are many drawbacks as well, and part of that is that there needs to be an investment that tends to be invisible to techies: learning how to use the computer, operating system, and painting program, plus they have to relearn all of their painting skills to get the results they want out of it. People are smart; they know about that cost and will measure it when deciding to make the investment.
The problem is that everyone in technology vastly underrates the investment you have to put into learning these things. Techies like you and I don't have a good grasp of all the time we put into it, and we have the gall to tell them it's easy. We don't even completely understand why they are upset most of the time; people aren't upset that they don't know how to clear an error, they are upset that there is an error there to begin with. I'd argue that the entire reason proprietary software dominates the world is that those companies spend their resources making things as easy and painless as possible; literally all of them market their product based on those factors.
Rounding back to the idea of commodity computing, the thing that made computing ubiquitous was the creation of the modern smartphone. To put things simply, the things you are complaining about are simply the way of the world. To make things go your ideal way would be to change humanity in some very fundamental ways.
One thing that I think is worth noting is that software that “just works” can also meet the needs of the technically inclined. The earlier half of versions of Mac OS X (up through around 10.9) were pretty good at it.
The trick is progressive disclosure, which is a type of design where novices naturally only encounter the bits that they can comfortably handle, but as their knowledge and capabilities increase, progressively more advanced functionality in the software becomes apparent. This is a great way to design software because not only does it accommodate users at both ends of the spectrum, it tends to bolster the technical abilities of those who otherwise wouldn’t be all that interested at becoming “good at computers”. This is part of why if you talk to a photographer or graphics artist who’s been using Macs since the early 00s, they’re likely a walking compendium of Mac OS keyboard shortcuts and can probably tell you a thing or two about automating tasks with AppleScript, which is treading into programming, even though they probably don’t consider themselves programmers.
This requires developers to place a great deal of care and consideration into the design of their software however, which has become increasingly uncommon. These days software design is driven by analytics numbers (which can be used to justify just about anything if presented correctly) and marketing. Even Apple has been losing its grip here.
Unfortunately this is not something that’s yet been embraced by the FOSS scene. Things still tend to be pretty binary there, being oversimplified with all the corners sanded off (e.g. GNOME) or pretty squarely aimed at technical users (e.g. KDE). Progressive disclosure is hard to come by there.
Up to a point, I do agree with some of the things you are voicing. The option should be there to go in and do things yourself.
At the same time, technology getting simpler to use is nothing new. You see similar discussions about cars from car enthusiasts. And they also have a point; at the same time, a lot about cars has gotten simpler and more reliable over the decades.
This is just one example, but is generally true for most technologies. And for most instances, you will find experts in the field being overly annoyed by the lack of interest end-users show about the inner workings of something.
I also do feel you are conflating a lot of separate things into one meta subject. Possibly too much. For example, forced software updates can be an annoyance. But, they are also an absolute necessity in the modern software landscape from a security standpoint.
Of course, security updates shouldn't necessarily mean that functionality changes as well. But realistically speaking it is a lot to ask from developers because at some point new functionality will be developed or changed. Maintaining security updates for each past version where functionality changed might be asking too much.
Lastly, giving users more options also means more ways for them to mess it up. Which is a real pain to deal with on multiple levels. So from that perspective alone, I can understand the push towards simpler software with fewer options and less user facing errors. It still frustrates me from the perspective of a responsible power user. But I fully understand it from the perspective of having had to deal with irresponsible "power" users or users taking technical errors out of context only to cause confusion.
Though, I do think that many companies have gone too far in trying to shield their users.
Again, I do share your frustration in general. I hate the direction of having less information available. It is just that I also saw some room for a bit of nuance or more context.
I simply think that simpler and more reliable is possible with user-respecting design.
As for security, I agree that updates are important and should be installed. It is also one of the subjects where I would personally consider opt-out acceptable, but that opt-out should exist, because otherwise adversarial functionality is easier to push.
I do not think it is a coincidence that downright user-hostile patterns appeared en masse in the desktop space after the normalization of forced updates on the dominant platform. In Windows 7 it was just about perfect: updates on by default, with a scary warning if you chose to check manually or opt out.
As for the last point - I simply think that a minimal level of general tech aptitude would actually reduce the need for support. Having simple, reliable software is good, but I think that right now we are only exchanging technical difficulty for artificial barriers.
I actually don't think having software simplified and un-configurable is a bad thing for the mass of users who have zero interest in learning about technology.
What we should rally against is having that kind of technology be mandatory in our daily lives. There should always be an option to use controllable software for those that want or need it.
Overall good take, I can't really say I disagree with any of what you've said. Obviously there is a need for certain things like auto updating software, it makes sense to have the latest security patches, but this should be an opt-out, not something I'm forced into. Part of the reason I switched to Linux was because of weird forced defaults from Windows and being able to actually own my device - If I want to break it, let me.
Whether it's a good idea to have this same approach for end users is debatable, but I really do think that there may be some benefit to allowing end users to break things if it's the most effective way to learn.
This rant is as old as time and doesn't really change.
The simple fact is people don't want to screw with their tools, they want them to work. Very few people want to know how to make a hammer, they want it to drive nails. Same goes with art, tech, cars, business, whatever.
Further, most people don't have TIME to learn all this stuff. So while I do think education on the underlying fundamentals, and just learning what your stuff can do, is worthwhile (the amount of shit in iOS alone that is intuitive but never explained and well within the average user's ability to take advantage of is absurd), I also think that in general people have unrealistic expectations.
It's like saying everyone should know how to draw or paint or do public speaking or whatever. All of it requires time and dedication, and most don't have the time or the dedication for that, nor should they be required to.
Cynically, it's more profitable for companies if people don't (want to) care how things work for themselves. Forcing a kind of learned helplessness with "oops" error messages and no actual cause/steps to take encourages people to take things back to the shop for help.
They can sell you support, or better yet, a subscription to a service they promise will keep working... So long as you don't mind not owning anything you save there and it mines your behaviour to sell to the highest bidder.
It's the same reason companies are fighting right-to-repair so hard. If they make something durable that you can fix yourself, how are they going to keep making more and more money off you? So they implement DRM and stop trains running if they get repaired by an independent shop (this was a real story).
It's an increasingly fragile situation. For example, big game engines like Unity got started because they were able to hire people who knew how to make their own game engines. Now they're successful and fewer people are writing game engines, where are places like Unity going to hire the next generation of engine programmers from? Now extrapolate that to every industry.
I don't know what the solution is, except to simplify in a more productive way, like replacing generic software libraries with bespoke code that's faster/easier to debug, or reducing the amount of software we use, like the IoT fridge that's just a security vulnerability with a 5-year timer.
I do agree with a few aspects of this around services becoming more of a black box, especially with annoying and vague error messages...
... however, I do have to disagree with you STRONGLY about automatic updates and forced updates, etc. I've been in IT a long time, specifically in client services so I deal with people and how they interact with computers every work day. The world is so much better for forcing people to update. Sure, it might annoy them, but the number of viruses and other malware I see on a normal basis is a drop in the bucket compared to 10 years ago even. As it is, people will nudge back the updates over and over again until the OS just says "sorry, we can't delay this anymore", and you know what solves most of the problems I run into? Applying updates and restarting. You can't tell me that users don't have time in their life to let their computer go update for awhile and do something away from the computer.
Additionally, maintaining software gets exponentially harder when people on all sorts of different versions are trying to work together. Do you remember dealing with all sorts of compatibility issues between Word versions? Or Photoshop? I could go on and on.
I dislike that this means a lot of companies have switched over to a subscription based plan, but at the same time I can understand that maintaining security patches and updates does cost a lot of money in some cases so I can't fault a company for pivoting to something that allows them to continue to exist. If only there weren't a lot of companies that abuse the subscription concept :(
And my final point would be, in all of my experience in IT it has forever been the case that a percentage of users will just give up on dumb problems without trying for anything themselves. This hasn't changed. And you have a confirmation bias because you probably don't see all the people that troubleshoot the problems themselves because they often can fix it. So keep your biases in check here. There are still some very smart people out there that aren't in IT but still manage to get by quite well in the same way there are still people who barely know about cars but still can do basic maintenance like topping up fluids and changing air filters.
I have a few thoughts. The first one is that it takes all kinds. I was very much into technology at a young age, and built my own computer from the NAND gates up. Built my own ALU, implemented my own OO compiler, network stack, and rudimentary OS.
There are people who have that same level of interest in bugs, or cars, or music, or the mathematics of barnacle formations (a former finance analyst I met in grad school). To all of those folks any time spent dealing with technology is a distraction from what they would rather be doing, even when the technology is an enabler of that activity. They want the technology to be passively functional behind the scenes and otherwise stay out of the way.
And while my own inclination to be self sufficient would argue that people should learn a little more just to be more self sufficient in a digital world, I also recognize that most people don't share that view either. So the technology companies meet people where they are.
My other thought is that I might need examples of the things to understand the problem. For context, in my home I have three windows computers, two Linux computers, a bunch of Android devices, my own hardware firewall, 3 WAPs with controller, and a fleet of Linux servers in AWS. I use Microsoft 365 Family for access to office tools and OneDrive. I use iDrive as a secondary backup, and 1Password family for passwords, etc. I also remotely manage the border and WAPs for family across the country, and use various monitoring tools. In my professional life I pivoted away from direct technical roles after I was director of engineering, but have a long and storied tech career.
So I experience a broad cross-section of technology. In all of this, I really don't encounter many problems. Things mostly work, and when they don't, I usually see some error or message or indicator that feels appropriate to the tool and the audience. When my Linux volume fills up I cat the logs and see something mentioning space. When OneDrive can't connect it usually says so and suggests I check my Internet connection. Admittedly, troubleshooting my Docker containers in AWS gives me some messages that require some deep knowledge to parse.
But overall, everything is great? I mean, compared to where I remember things starting. Troubleshooting natural calls to C runtimes in OMVS segments on a zOS mainframe, learning job control language, moving files between big endian and little endian systems, etc, things are amazingly better. Each message might not give quantitative details, but they usually give qualitative information, like I can't connect to the Internet.
Someone else's comment also felt quite on point regarding fixing other people's technology. They used the example of work computers, but I would extend it to managed platforms and services. Their support exists for a reason, so I put the work on them when possible and free my time for the things only I can fix.
I don't know if that is relevant to your experience. But I just don't have the issues with technology that many folks seem to have, or routinely get unhelpful error messages.
Big agree. I can appreciate that there are people who just want to turn their internet machine on and go to facebook or whatever, and i think that's fine. good for them. But i also want there to be a toggle that says "i want to control my device myself, and if i fuck it up, i'll unfuck it myself" (kinda like what MS tried to do with S mode, i guess? Like the device was in S mode by default, but you could turn it off. I want that, but for everything. Telemetry, updates, features, etc, like you said)
My issue is less that technology is being simplified, and more that it's being simplified and we are not given a choice about it. I completely understand that my work computer might need to run in the hypothetical idiot-proof mode, because that would make my colleagues' jobs easier. but at home? on my personal device? on my own internet connection? i should be able to do what i want to that device and not have literally everything break (see, if i want to play minecraft on windows, i'm forced to have telemetry enabled, windows update enabled, and a bunch of other little things that i really don't want enabled, because minecraft now talks to all the same ip addresses as microsoft's spyware addresses.)
And sure, i can (and have) gone over to linux for a lot of stuff. I can play minetest instead. But the fact remains that that's not going to fix the overarching problem of "companies are treating me and my data as the product and i get no say in it whatsoever if i want to participate in modern standard internet/technological/online social life." None of my friends are technical. I can't tell them to play minetest with me, or set up an xmpp server and have them chat with me there instead of Discord. Tildes is great and i really love it but it's also small and doesn't really replace a lot of the subs i used to frequent on Reddit which were for niche interests and upbringings. you know? So my and other people's option is, in my opinion, to be the product and interact with my friends/be able to exist in mainstream social spaces, or to not be the product and be cut off from the mainstream.