What have you learned from working in tech?
Question is for our users here who work/worked in the tech industry (in any capacity) or in a techy position in any industry.
What have you learned?
How did it change you?
Previous questions in series:
What have you learned from...
...being a parent?
...going through a breakup?
...moving to a new place?
These threads remain open, so feel free to comment on old ones if you have something to add!
Technology was a mistake.
I'll expand on this one a bit, because I fundamentally agree.
Computers are a tool. But there came a tipping point when most computers became just another vehicle for consumption. And really, it's just the latest tip of the iceberg.
The story of Notel highlights quite well the adverse effect introducing just 1 TV station had on a town.
The passivity encouraged by consumption is depriving us of fuller lives. Even as I type this out, there's 100 other better uses of my time, but I feel compelled to finish the thought.
My parents circa the 90s weren't wrong about videogames and computers being timewasters, but they were wrong in the sense that they saw TV as an acceptable relaxation activity instead. At least computers have the capacity to do more than consume.
Put down your phone and walk out of the house. Drag your kids with you if you have them. Be bored in the front yard together and stare at the stars. Or the light pollution I guess. Think about how much it sucks that we spend so much electricity keeping the lights on for so many things while 99.999% of us sleep. And how many nocturnal creatures get messed up because of it.
Counterpoint: if we all drove our gas guzzlers to a local park and lit campfires, we would destroy the environment more quickly.
It's not necessarily healthy for us, but wasting time on digital devices might be better for the environment than the alternatives?
How many gas guzzlers to mine resources, ship, smelt, ship, build, ship, assemble, and deliver our electronics and cars?
Before TV, neighbors would just go outside and talk to each other in the evenings.
We have learned many things from having computers. I think it would have been better for society as a whole if they were still treated more like particle accelerators and less like portable televisions.
That's the kind of life I want. I want to live in a community of small houses grouped around a communal garden with connected terraces. No driveways. (A pocket neighborhood?) In the evenings my neighbors and I would just chit chat, have occasional neighborly dinners with wine and play music and board games and talk literature and art. And my future children would play with theirs in the garden or build a treehouse in the woods nearby.
I have no need or desire for a giant 2-or-3-car garage, a lawn-as-a-moat, a giant living room to house a giant TV...
I don't know how we got to where we are, where so much of our lives is mediated by anti-human systems. Our cities are primarily experienced through wide roads and highways / automobile death canyons. Our private lives are primarily experienced through devices and the corporations that control the ecosystems they provide.
My ideal I think is something like a college condo:
Private family den, small kitchenette, bedrooms, and bathrooms
Shared between 3 and 5 families: large den, rec room, communal pantry/kitchen, workspaces/garages, yard.
Just for easy googling, the term for that type of complex is cottage court. There's a fair amount of them around where I live, and they are really cute.
It's not really an either or proposition. Lighting camp fires at local parks isn't the only thing you can do besides watching TV or browsing the internet.
That is kind of beside the point. Most entertainment involves either a gas guzzler or the internet. The same is increasingly true for employment.
As a recently-hired software dev: I needed to get a degree for this? I want to throw my phone and all of my electronics in the ocean
You do realize that while you claim you are trying to make the point that technology – specifically, computers and television – is responsible for these issues, you have spent most of your comment explaining how consumption is the cause?
Computers are a tool. That's exactly correct, and that's exactly why they aren't the source of the problem. Taken individually, I believe your points are correct; by focusing on consumption, they get far closer to the root issue. But that doesn't mean it's wise to blame a medium for the problems of the society that utilizes it.
No, I wasn't saying technology was responsible for these problems, I was saying technology was a mistake because of these problems.
That's an awfully slim difference, and does it honestly matter all that much? Nothing is a mistake just because it's abused; it only becomes such if abuse is all it can be expected to be good for, and technology isn't that. We've gotten countless wonderful things from computers, and video is a perfectly respectable medium for art and information itself.
The fact that people use technology as a means to hurt others or themselves is indicative of a societal problem, not that there's something inherently wrong with tech or the innovation that produces it.
Some say coming down from the trees was the mistake :)
Even the trees had been a bad move, no one should ever have left the oceans.
Let's just go back to Douglas Adams:
Back to Adams? We never left
Oh wow, my memory stopped at digital watches there.
I think we’d be best off cutting it back to the minimum needed to support medical advances.
I don't even think medical advances. Death is a part of life. 90-100 is plenty long for a human to live. We will never cure all ills or tragedies.
We'd be better off spending as much of our technological minimum as possible on ensuring equal access to it. No more 'iPhones for me and dysentery for thee.'
From there, target the advancements at early retirement. I'd rather live to be 80 and retire at 40 than live to be 100 and retire at 60.
Will more people die due to lack of further medical advancement? Probably. But more people currently die due to unequal access than due to lack of invention of newer and newer meds.
That seems dismissive of the lives of the disabled and all of the non-fatal problems people would have to needlessly suffer with.
Dismissive? Sort of, but it's a bit of a realism thing. When you're talking anti-growth, there is going to be some degree of suffering. Or even if you're talking about 'leveling off current energy usage' and distributing it more equitably.
So yes. People with non-fatal yet unsolved discomfort can live with that discomfort so that all the people who do not currently have clean water, vaccines, and healthy food get them.
We have passed peak oil. As we move forward, more of society's labor is going to need to be spent gathering energy than is currently. It's easy to say 'we can do both,' but as cheap energy comes to an end, we can't.
Sure, we can cut the easy targets first. Everything relating to advertising for starters. But someday a point comes: 60% of our labor goes to gathering energy for the other 40%. How do we allocate that 40%? Say food/water/housing/existing medicine takes up 30%.
You now have 10% of all remaining energy and labor available for 'beyond the basics'. Some needs to be allocated for ensuring future basic needs are met. Call it 5%.
So you have 5% to allocate for everything else. Do we allocate it for endeavors for 95% of people or for endeavors for 0.1% of people?
If a 3 year old gets cancer, do we spend 3 lifetimes' worth of energy and labor to maybe give them a 5% chance at survival and being further disabled, or do we let nature take its course? The current status quo means 'the billionaire's kid lives and the poor kid dies'. Perhaps we just accept that cancer deaths in kids are a tragic, but excessive, burden on society to attempt to cure - one that might be possible with access to unlimited energy, but not in a society constrained by it.
Let's talk about a more salient example: continuous glucose monitors and insulin pumps. These are not unsolved problems; they do not only affect a tiny minority. We know how to build CGMs and pumps, and there are 1.85 million type-1 diabetics in the US alone. T1D cannot be cured or prevented through lifestyle changes. Access to a CGM and insulin pump dramatically improves quality of life as well as extending it, and from a cold economic perspective, not ruining one's sense of touch through hourly finger-pricks and/or hypoglycemia attacks makes people much more productive.
And yet, CGMs and insulin pumps suffer from the absolute worst excesses of the computer and medical device industries. They are unreliable, vulnerable to obvious and simple attacks, brittle, proprietary, and ridiculously expensive; for instance, a Dexcom G6 applicator and sensor package lasts at most 30 days, costs about a thousand dollars, and requires a mobile app which only works on certain blessed phone models. Engineering effort directed in the right way is needed to solve these problems and, thereby, dramatically improve the lives of millions.
"No more 'iPhones for me and dysentery for thee'" is fine - but we don't solve that by getting rid of iPhones or their healthtech equivalents, we solve it by taxing the fuck out of everyone above the median and giving the fruits of that wealth to everyone else.
The worldwide GDP is about $104 trillion. Equitably split across the world, that's $13,000 per person. That's how bad the wealth inequality is worldwide. Taxation alone isn't gonna fix it. An equitable world means everyone in the USA gets taxed to the poverty line.
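A quick back-of-the-envelope check of that figure (assuming a world population of roughly 8 billion, which isn't stated above):

```python
# Rough sanity check of the per-person figure.
# Assumptions: ~$104 trillion world GDP, ~8 billion people.
world_gdp = 104e12   # dollars
population = 8e9     # people
print(round(world_gdp / population))  # -> 13000 dollars per person
```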
Keep in mind that equitably splitting that income also means high-tech gadgets are going to be orders of magnitudes more expensive.
I'm not saying we throw everything in the trash...but new advancements will need to be lower tech than they have been.
And if anything...you've just kinda pointed out most of the problems with the device in question aren't due to a need for technological advancements...merely social ones.
Further... 1.8 million Type 1 diabetics in the USA could have a quality of life improvement. Over 2 million people in the USA are living without access to clean water. So literally, in the present, we have that iPhone/dysentery problem in the US.
So yes, in the future world where energy becomes scarce, if I were put in a position where I had to choose...yea the diabetics live with the fingerprick until we figure out universal clean water. Especially since freshwater is becoming harder to come by.
Doesn't do us any good to build more things that rely on cheap electricity when that era is coming to an end.
The mean American family has a net worth of around $750,000; the median American family has a net worth of $120,000 [1]. In fact, wealth inequality in the US is worse now than it was 200 years ago -- and that's including the fact that a sixth of Americans were slaves [2]! So yes, taxation could go a long ways towards reducing inequality in America.
Fortunately these aren't mutually exclusive propositions, and certainly improvements in diabetes treatments are not coming at the expense of someone else's access to fresh water. Frankly, it's a bizarre comparison since there are many luxuries we could dispense with before we eliminate CGMs. I don't think you appreciate just how life-changing CGMs can be; a sudden change in a type 1 diabetic's blood glucose level can be lethal.
You're inventing a sure future that supports your position, when in reality the future is not sure, and in particular this future - where society is meaningfully intact but energy is so scarce that we can't operate insulin pumps for a lack of it (???) - is exceedingly unlikely.
Among other things, there is enough uranium in accessible deposits to power the entire world for centuries. We could use taxation to build more renewables and storage. We can avert the worst.
Yes. It may surprise you, but I've actually thought a lot about this topic.
There is, but vast numbers of people seem to be unwilling to use them. Perhaps with the end of cheap oil (and thus cheap solar/wind) that'll change. We're past peak energy in the current political climate, where we're not building nuclear power plants.
This entire sub-thread was predicated on 'Cut back to the bare minimum to support medical advancements.' We're already in fantasy territory. That said, peak energy is a very real thing, and likely not as far away as many assume.
My discussion was a branch off complex technology being a bad idea. I generally hold that industrialization was probably a bad idea.
I think this is actually a very interesting point illustrating that even if inequality were absolutely removed from the equation, everyone would still have a not-great time.
The primary human-civilizational problems then are:
The fundamental lie of capitalism has been 'a rising tide raises all boats.' On average, most people are still very, very poor.
It really does put into light the conditions of socialist countries. In a socialist country, it's likely everyone is going to be poorer than above-average citizens in an equivalent non-socialist country. But the bottom rung should be much higher.
Medical advancements don't just extend life — they improve its quality. Sure, increasing access to medical care will help improve quality of life, but if you halt medical advancement people's lives will get worse long term.
Take long COVID for example. It's a serious, disabling condition, and unfortunately we don't know enough about it yet to treat it properly (in part because medical science has ignored post viral illnesses for decades). If we focused on getting people with long COVID better access to medical care and ignored researching it... There'd be no treatment to offer.
We also already have a dearth of new antibiotics, which is extremely worrying with all the antibiotic-resistant bacteria strains out there. We could easily be barrelling towards a world where people die very early in life from currently treatable illnesses.
Or take pandemics and new infectious diseases in general, which are likely to increase in frequency due to climate change. No more medical research equals no more new vaccines and treatments.
The medical landscape isn't static, so if we stayed with the current repertoire of tests, treatments, and preventative measures, suffering would eventually increase, even if medical care was actually accessible to everyone.
In software engineering, there is a dearth of desire to test the new and reject it if it's not an improvement over the incumbent. Sometimes I think some people don't even think of that as an option. There's a lack of healthy skepticism. A lot of people don't seem to have an attitude that judges proposed tech on actual, measurable merit.
On the other hand, I sometimes see the same attitude hindering improvement, but from the other direction: Being too comfortable with inertia and status quo, and therefore being unwilling to consider or try new tech which could be an improvement.
It's the same problem either way: Some people don't take the time to compare alternatives. It's frustrating for those of us that do, because we see what we think is the superior alternative not being used or even considered.
Other things I've learned:
The software/web development industry overall has become increasingly tolerant of complexity and difficulty. I'm not sure why, but one hypothesis is that newcomers to the industry just don't have the experience of what it was like before they came, i.e. how easy, simple, and straightforward some things were.
This is by no means specific to software development. These days I do various things to bits of metal and wood in a shed on my own for a living, and even then half my job is dealing with people. Every job I've had has required significant interpersonal skills, because... people. They're always there. They're the client, they're your peers, they're your boss, they're the person who, if you're nice to them, will fix your wobbly desk, etc. etc.
This is an excellent point. I used to track so much tech progress, but now I lean hard toward not jumping on the latest thing. This is twofold: cynicism about tech as humanity's savior, and my recent tech background. I do testing for an engineering software package that's been around a long time. With hardware and analysis advancements, I'm on the cutting edge constantly at work. In my personal life, I want ultimate stability at home. I spend all day finding and reporting issues, so I don't want that to creep into my personal life.
For your point about development as art, I'll plug this article on Agile in Logic Mag. The exportation of agile is something I observed on entering the tech industry. It's a really useful tool, but I agree with the article that it ultimately comes down to the company's culture and intentions, which sculpt how well agile works there.
I used to be on board with trying all the new tech, and I kind of still am, but I've seen it tried in practice enough times now that I understand why people can be hesitant (or just downright entrenched).
Any software project of appreciable size is an incredibly complex system. While some new tech may offer features that could alleviate bugs or tech debt, it is increasingly hard to determine what new side effects can occur. Even with robust testing, prototyping, etc., it cannot be determined with 100% certainty; at some point you have to make a judgement call and operate on faith.
And regardless of which way the decision goes, neither method was ever going to be perfect. So you end up with someone being partially vindicated, and someone being partially jaded, and both of them likely use that to reinforce their view of "never jump on the latest thing" or "always jump on the latest thing".
If there is any one thing I wish I could drill into new developers' heads, it's this. Collaboration comes naturally to some people; for others it's something they'll have to work at their entire life. But I guarantee you it will be worth it. Collaborators help you when you're stuck, find those errors that are simple but take days to track down, and will have ideas that you would never have thought of.
Developers don't need to collaborate all the time (i.e. pair programming - ask me what it's like doing that 8 hours a day, 5 days a week, for 4 years), but being able to sit down with another developer and rapidly iterate through ideas and problem-solve is incredibly helpful.
I think software development is far more artistic than most people think it is. Individually, people can create art by painting portraits by themselves, practicing color theory, testing out small experiments with mixed media, maybe even doing some large murals. But a lot of software projects are the equivalent of painting the Sistine Chapel expanded to the size of the Superdome. Can one person do that? Theoretically. Way better to have some friends to help, though, and that not only requires some skilled engineers/artists, it requires communication, planning, coordination, consensus on styles and materials, and a long list of other things.
That's a tad unfair; inertia and status quo come with some advantages that are hard to quantify. The fact that the technologies you're using have known qualities for stability and maintenance is valuable in and of itself, and trading them for something new comes with both risk and development costs.
Incidentally, I hate numeronyms with the energy of a thousand dying stars. I have no idea why programmers insist on using obfuscated terms. I think a11y is the worst one because it's intentionally doing the opposite of what the word means.
We’re building different apps than we were before. The problems arise when people use webpack, react, and node to run a tiny mostly static site. Unless it’s an educational pursuit, that’s overkill.
But most people are learning tools used to make Facebook, Figma, YouTube etc. I’m so glad things are where they are today. The tooling is good (enough). The underlying web standards are good. Depending on your stack it can be a lot or a little work to get a hello world going.
A lot of tech folks learn to be concerned about privacy. You have a much better idea of how "secure" your data might be, and you get a different insight into PR responses about certain things. When a company like Facebook/Google/Twitter says that your data is secure, maybe they mean it - but they are probably talking about how things should be, and likely aren't checking with the engineers to know how things actually are. And smaller companies are even less likely to do their due diligence. A large number of engineers at many companies can access production data with few controls and little auditing.
And while data can be "anonymized" and not have your name on it, you can learn a lot by looking at a single person's anonymous data. If someone really looks into you, it's hard to hide. Many people haven't seen this simply because no one has singled them out, but if someone does focus on you specifically, they can likely learn a ton about you - including finding a bunch of passwords from previous breaches, insecure information that is often used for security questions, and enough information to impersonate you to customer service to gain access to your account(s).
Security pro-tip: security questions are insecure and you should not answer them truthfully. Either use a password manager or figure out a scheme that is repeatable but doesn't make sense based on the question. So perhaps any question about a maiden name would actually have a fruit as the answer... either all of them are "Banana", or you can go further and do something like the fruit with the same starting letter as the company. So Google would be Grapefruit, Twitter would be Tangerine, and Apple would be... Apple.
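To make that scheme concrete, here's a minimal sketch in Python; the fruit table and function name are purely illustrative, and you'd want your own mapping rather than one posted publicly:

```python
# A repeatable-but-untrue security-answer scheme, as described above:
# every "maiden name"-style question gets a fruit, keyed by the first
# letter of the site's name. Fill in the table however you like.
FRUIT_BY_LETTER = {
    "a": "Apple",
    "g": "Grapefruit",
    "t": "Tangerine",
}

def fake_security_answer(site_name: str) -> str:
    """Return a deterministic, non-truthful answer for a given site."""
    first_letter = site_name.strip().lower()[0]
    return FRUIT_BY_LETTER.get(first_letter, "Banana")  # default fruit fallback

for site in ("Google", "Twitter", "Apple"):
    print(site, "->", fake_security_answer(site))
# Google -> Grapefruit, Twitter -> Tangerine, Apple -> Apple
```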
Anyways - a lot of tech folks who see that will either get privacy-conscious (avoid social media, use duckduckgo instead of google, scriptblockers, adblockers, linux, etc) or will "give up" and assume that all their information is out there.
I think it’s safe to assume that for most companies and even most technology companies, security is pretty bad. Most random hardware devices you buy on Amazon are likely to be insecure.
I think Google is an outlier, though, in the sense that they invent new security technologies to lock things down even more and often actually deploy them. It hardly seems fair to lump them in with Twitter, which we’ve learned from a former head of security is pretty bad internally.
(I’m mostly in the “give up” stage on privacy, due to having posted so much with my real name. Security and privacy are different.)
They might be better as far as security (Twitter is not setting a high bar, tbf), but they are also the most prolific when it comes to data collection, which results in other problems... like Google having the ability to lock your account for any reason (or no reason at all), which would cripple many people's online presence completely. There have been some threads about "degoogling", and from what I can tell, it seems like the answer is "go all in on Apple's ecosystem" or become a reclusive hermit.
In my experience, degoogling isn't that hard if you only aim to approach 100% purity, and not try to actually achieve it. I don't search with Google, don't use GMail for mail, and avoid Google Docs and Google Drive unless someone shares something with me. I don't use Google Maps, and don't use Google Analytics for my websites. I have an Android phone, but don't use any Google stuff there, either, other than the things hardwired into the OS.
Yes, getting locked out of your account because "computer says no" is the one threat that can't be fixed. I don't know how to judge the probability of that happening for Google, but I've seen enough stories that I'm wary. (Although, the problem with judging the stories of strangers is that you never know how shady that person might be.)
In theory this could happen for any account. It's not out of the question that Apple could do it for iCloud. Also, banks can and do tell people that they don't want them as customers anymore, or they might decide not to create an account at all.
I think it would be sensible to back up everything. Google doesn't make it easy to do this on an ongoing basis. There is "Google Takeout" where you can do it periodically.
Project Managers are such a mixed bag. You get the good ones that engage with their developers, try to get them the resources they need and are quick to stand up to "I want it yesterday" clients and product owners. Then you get the bad ones that just show up to the stand up to call out names and create tickets.
A lot of things are mixed bags, I guess, but this is fresh in my mind. I recently started a new job and we had a really good PM on our project. Just made stand ups a lot of fun while also doing his job really effectively. Two months into my job he left and his replacement is just...calling out names during stand up. In their defense, there's some ramping up to be done on their part, but the contrast is so stark. We had a developer yesterday talking about how exasperated they were with trying to figure out a ticket. The new PM's response was "Okay, <next person's name>". Like NO! Our old PM would've held up the whole damn meeting until he found someone to help unblock that developer. Or at least been like "hey let's chat afterwards and we'll find you someone". That's not something that requires intimate knowledge of the project or ramping up to do. That's just basic Project Management, imo.
I've had a variety of PMs, and a good PM is worth their weight in gold. I'm now in a position to help mold and shape fledgling PMs (student interns) and I'm doing my damndest to make sure they understand the differences between trash PM, good PM, and amazing PM.
On behalf of all devs everywhere, thank you so much!
Homework: Watch Office Space, write a report on why Lumbergh is a horrible boss
Next week : Silicon Valley
There's probably something you trust computers to do. Not everything obviously; but there's probably at least one thing that you're pretty sure computers have well in hand.
This is a mistake. Computers do not do that thing reliably, and you should not trust them to. The fact that they've done so often enough for you to believe that they do is at best coincidence, but more likely a ruse to lure you into complacency. Computers fail sometimes at effectively every single task we have ever set for them¹; all software is garbage written by incompetent people under bad management and all hardware breaks all the time.
¹ I'm phrasing this in a way that blames computers themselves, but obviously people write the software that computers run (as well as designing the devices themselves), so ultimately it's all people's fault. And, like everything people do, it's more the fault of the society and power structures in which people operate than that of any specific engineer or programmer.
Here's a tangential, very philosophical thought. Software engineers are not paid to write code. Rather, software engineers are paid to make business requirements rigorous enough that computers can execute them. (The typical mechanism by which this is done, of course, is by writing code; but the code is a means, not an end.) This is why (1) software engineers and businesspeople seem to clash so frequently: functionally, the engineers are essentially copyeditors for the businesspeople's ideas, which means pointing out to a lot of fairly self-important people the ways in which their thoughts are fundamentally incoherent; and (2) software engineers have proven so difficult to automate or replace: no amount of tooling that makes it easier for businesspeople to create software without having to learn arcane syntaxes or complex compsci theory will actually enable them to create software, because, to be frank, the syntax and theory and math are the easy parts of being a software engineer.
The tech industry is difficult for people with fragile egos. New additions to any organization are assumed to be stupid until proven otherwise, and treated accordingly. This often manifests itself as condescension or patronization.
Some people simply don't have the emotional sensitivities to care and are able to push through it. Others such as myself are extremely sensitive to this, and dealing with it becomes exhausting.
Would you say that's more true of tech than other industries?
Do you think the people with titanium egos are likely to act the same way once they've passed the condescension threshold themselves? Like is the toxic environment self-sustaining because of the people, or the shape of the organizations, or something else?
Given that I haven't significantly worked in other industries, that's hard to say. I don't think it's exclusive to tech, but I do think it is probably more common in tech than many industries.
Yeah, I think that fundamentally it comes down to the people the industry attracts. When I describe working in tech to people who don't, here's what I always say:
"Think back to your years in school, and that one socially-awkward smart kid who had to be right about everything, point out everybody's mistakes, and cannot read the room. Now imagine drawing upon all of those smart kids from schools around the world, throwing them onto a team, and telling them that now they all have to work together."
I have no doubt that there are other careers that also have this issue (medicine, maybe?), but I also have no doubt that there is a concentration in tech.
Mostly that I don't want to work in tech. The intellectual challenge is fun but using a keyboard all day long is bad for anyone but it turns out that it was especially bad for my body. Which I realised quite a few years too late, but there you go.
Also over fifteen years in the industry I think I only worked with three people I genuinely considered to be excellent programmers. Only maybe one I considered great. The amount of stuff which is cobbled together by people who barely know what they're doing is terrifying/amazing, although that's not really just a tech thing. That's everything.
It feels like a trap sometimes. You get into this high-paying field and you spend all this time shoring up those skills so you can advance...but if you want to leave where are you going to go? All your skills are tech, your whole career has been tech. You can't always easily find a similarly paying job without spending a ton of time/money on education to jump fields. Very frustrating.
I don’t know - tech seems like it’s got some good avenues out. If you want to run a business you can start one with your tech skills but migrate to leadership if you’re successful. And if you have any useful non-tech skills you can probably work in that domain, but add in programming or whatever. Now at least you’re programming for a different field.
When you're a kid you can have ideological reasons not to work on weapons systems, but that means you end up working on things like coal mining technology. I look back at the stuff I built for Eurofighter Typhoon or Joint Strike Fighter, and the stuff I built for Airbus or Joy Mining, and the civilian stuff has killed very many more people than the military stuff. Like, orders of magnitude more.
You have worked on some interesting projects.
I worked for sub-contract electronic engineers. We built other people's products for them. But for the aerospace stuff we didn't build anything that went in the air, we only built test equipment. And for the military stuff we built test equipment that was used to build test equipment that was used to build the planes. It was interesting because I got to learn about things like ARINC 429 (which is a communications protocol for commercial aircraft), and the robust levels of redundancy are just nice to see.
The coal mining stuff was a bit different because we were building the actual product. There's a type of mining called "longwall" mining. There's a short explainer video here: https://www.youtube.com/watch?v=bXORrVmxwbM
At 1:56 they show a roof support. We called those chocks. And the chocks are all interfaced to each other, they all talk to each other, so the operator can move them all forward when needed. The box the operator uses, that interfaces them all together, is called a chock interface unit. And that's what we built.
In this video you can see one of the chocks moving forward: https://youtu.be/WmwEB4DY_jc?t=62
And then here, over his shoulder with a blue screen, you can see a chock interface unit. https://youtu.be/WmwEB4DY_jc?t=125
The coal mining stuff was interesting because there's a solid safety engineering background to it. Just huge amounts of thought went into keeping it safe. It was nice to see something done properly.
Holy hell. Say about coal and carbon emissions what you will (I'm with you) but that mining process is seriously cool. They have a 1km long roofed shack that essentially eats through the coal deposit. The amount of mechanisation there is insane. And the capital investment too, I'd presume.
Between the safety engineering there and in aviation, I'm convinced I'm in the wrong field. There's no such thing as reasonable software engineering practices in AI. I mean, we're all aware of our social responsibility here, as far as bias, discrimination and potential for abuse goes, and I have faith in the related components. But otherwise the engineering quality of the work is generally shit. I mean, there's lots of people who bemoan that, but the inertia of the field is such that there's no change in sight.
Software development (and technological development), particularly at higher levels of abstraction, feels like creation when in reality it is decisionmaking. At the beginning of a project (or startup), the world is a superposition of possibility, but every new piece of software, be it written from scratch or adopted from an external source, is a quantum entanglement that collapses your future possibilities. Interfaces get adopted, data formats become entrenched, conventions get established--these processes are not necessarily one-way but you will either pay back the initial time spent manyfold when circumstances change, or the whole endeavor collapses under its own weight. Better decisionmaking up front can quicken or prolong that process, but it is inevitable so long as your software lives in a changing environment (not that that's always the case, see the 20 year old machines running Win95 in airgapped installations). Part of my despondence with the tech industry is that the entrenchment of giant companies means that this exponential investment in prolonging can go on for MUCH longer (at much greater expense), where in the 90s or 2000s we'd just see companies replaced by other companies that could start with better-informed groundwork.
Corollary to the above: the hardest software development projects are rarely hard due to a technical challenge, but rather because people with decisionmaking authority are unable or unwilling to make decisions regarding their business or processes. Manual processes can accommodate significant slop in dissonant or outright conflicting decisions, by individual humans exercising judgment and massaging things at the edges (see, for example, how the healthcare system runs on fax machines because EHR systems just can't get along). Automating those processes, likely because the human fudge-factor is expensive, means trying to broker a peace between two or more warring conceptual factions. It's satisfying when it works, but it's a hard path to follow.
Software architecture is a hybrid of science fiction and journalism. Your success or failure stems from your ability to visualize and worldbuild in a (near) future fiction, and then bring it back to the present and document it clearly--that might be documents, that might be screenshots, that might be prototypes--such that you can get everyone involved living in that same future world. The further people (or teams') conception of that future world drift apart, the more problems you will have.
The only thing I like more than programming is not programming. All the best software engineers are lazy people who never want to write the same thing more than once.
All tools/technologies, at all different levels of capability and power, will both eventually be used for incredibly good things and incredibly bad things. I suppose this is true for more than just technology.
I constantly find it difficult to reconcile this at an emotional level. I suppose it's all about balance and perspective. The bad will always come with the good and we just do what we can to mitigate it
Sorry for sounding so generic