I guess this is a sort of mood piece, bringing together a lot of unrelated flaws and asking why so many mistakes and accidents happen?
Taking a really long perspective, I'd say that risky technology goes back to the invention of fire. To hit a few of the highlights: farming and ranching are dangerous work, whether you're working with large animals or heavy machinery. Electricity kills a lot of people; a lot of important inventions had to do with making it less likely to kill you or burn the house down. Trains kill people too, and there's a similar record of safety devices so that somewhat fewer people get killed every year. Early cars were a menace to public health, both in outright crashes and in subtler lead poisoning. Early airplanes: extremely dangerous, suitable only for daredevils.
Over the years, safety records improve. The airline industry actually has a better safety record than just about anyone, in part because it is heavily regulated. The saying is that the regulations are written in blood.
Fatal accidents are pretty dramatic and do drive society to make big changes, though it's often slower than you'd think. With fraud, though, very often losses are accepted as part of doing business, as long as it's at a predictable rate. Meanwhile, your desktop machine crashing or getting a virus doesn't usually kill anyone; people put up with it.
Still, Windows boxes don't crash the way they once did, daily or sometimes hourly. There are speed bumps slowing the spread of viruses, so they're not as common. Critical security flaws sometimes get caught before they're exploited (as far as we know). Now that frequent, automatic security updates are common, desktop and laptop computers are probably better off than many of the other dubious devices people routinely buy and install.
The current mood seems to be less accepting of risks as they become less common and less familiar, but more newsworthy? Our standards are getting tougher, and then we pretend our standards were always this high and that the failures we see all around us are somehow unusual.
So when I say “Boeing plane crash”, I’m using a metaphor, but it’s closer to real than you might imagine. The problems I’m going to detail that led to this breach are rife throughout all of software development. I mean, even Boeing’s 737 Max catastrophic failures came from flight-control software (MCAS)!
While the Windows breach probably didn’t affect your machine, it could affect your life. Breaches of Windows or underlying operating systems, or just bad software development practices, regularly cost time, lives and money. Sometimes it’s inconvenient, like when airline reservation systems go down and cancel or delay thousands of flights. It can be annoying, like in 2016 when Airbnb, PayPal, Twitter and Reddit went down due to an attack launched from infected ‘Internet of things’ devices, or when hackers took control of tornado sirens near Dallas in 2017 and turned them on and off between 2:30 and 4:00 AM one morning.
Sometimes it leads to identity theft after hacks of financial institutions; there were 3,494 successful cyber-attacks on financial institutions in just the first six months of 2019. It can cost lives; in 2017, 16 hospitals in England shut down all non-critical operations because of the WannaCry ransomware attack, as did utilities in Spain. And it’s a critical national security problem, with Windows XP embedded deep in, say, nuclear submarine control systems.
The risk is hidden and thus the problem seems out of sight. But that just means we’re in the early part of the theme song from Jaws.
The risks of relying on Windows (and Microsoft generally) are not hidden. They’ve been exposed in the media and in every single user’s personal experience for decades. The problem is that there is no accountability when managers take on such risk and it cascades into failures. Who makes the decision to purchase and run critical operations on top of the Windows platform? When the failures occur, are those decisions reevaluated? Are meetings convened to discuss shedding Windows? (It does happen occasionally, but apparently only when Microsoft decides to change its pricing by an order of magnitude. And CERN isn’t necessarily a representative case, since most of the actual research and scientific computing going on there wasn’t running on top of Microsoft software or services anyway. That is, CERN’s Microsoft dependence was mostly in services like email and video conferencing. Not that such functions aren’t important, but they are exactly the kind of thing that is likely to be sourced from an off-the-shelf solution. I’m not suggesting that anyone should eschew MS Exchange or Skype for home-grown mail servers or video-chat solutions.)
You could replace Windows with any other &lt;commercial software&gt; and the same issues would apply. But Windows is abundantly adopted, and its issues are correspondingly abundant. Part of the problem is that for a long window of time there simply was no commercially viable alternative to Windows. If you wanted to use an OS other than Windows in a business context, whether to run point-of-sale operations or just to give employees doing non-technical work personal computers, you’d have to hire and maintain an IT infrastructure, including personnel, familiar with the alternative. And Microsoft’s monopoly position would make such a proposition expensive. More expensive than the cost of operating as a Windows shop? I’d argue, in the long run, yes. But since when do managers and executives making decisions about fundamental software infrastructure think about the long term? (Besides, in 10 years the tech landscape will have changed anyway—we won’t still be using Windows then, right?)
Like most American industries, software today is composed of large corporations focused on financial engineering, mergers and acquisitions, and managed revenue growth. Marketing and "strategy" drive product decisions rather than the reverse. Legal and lobbying machinations take priority over technical innovation. In fact, we are so overexposed to this reality that even the jokes about out-of-touch managers and leaders at tech companies sound stale and overused these days: Dilbert and "Office Space" nailed this entire category of humor decades ago ("Office Space" came out in 1999!).
What’s important to understand is that, while there are always going to be some defects, most of these errors and vulnerabilities are not inevitable. They are not the result of technological problems; they are the result of corrupt business models induced by bad public policy around markets.