I think it's only a matter of time until another major CEO gets assassinated.
Leading a company in an industry that strives to replace workers with robots and drive mass unemployment is a sure-fire way to be deeply unpopular, especially when their antics have sent RAM prices skyrocketing and done serious harm to the consumer electronics market.
While I don’t advocate for violence, I think America is overdue for a socialist revival. Probably the rest of the world too. The world is not in a good place right now.
IMO that's the opposite of advocating for violence. If people aren't given a legitimate political avenue to meet their needs, they will just act outside the political system. A socialist revival would prevent the kind of desperation that leads to crazy acts like this.
Yeah I think we need to redefine what "violence" is in these dynamics and define a new term to replace it: "systemic self defense."
Because "violence" isn't always the simplistic "human hits other human." There's systemic violence, like threatening the jobs and livelihoods of people who will now have to struggle to survive because of those decisions, or creating a healthcare industry that thrives when it denies lifesaving care to its members.
I think reactions to that kind of systemic violence after being ignored in the proper systemic avenues for change shouldn't be considered violence, it's systemic self defense.
Just like with more physical violence, we shouldn't blame the victim for defending themselves, and people shouldn't be accused of "condoning violence" for supporting someone who defended themselves either. And once again, that same concept should be applied to systemic violence as well.
I completely disagree. I could get on board with the notion that a government policy threatening jobs and livelihoods constituted systemic violence, because the government inherently represents a threat of force. But accusing a CEO of systemic violence because his company produces a product that might lead to job loss is provocative and dangerous. Moreso because of the subsequent argument that people shouldn't be blamed for defending themselves against such "violence."
Is John Deere committing systemic violence by reducing the job market for manual farm workers? Did Apple commit systemic violence when the first Macintosh decimated the job market for manual ledger-keepers? Is a layoff because a company is going under systemic violence? Is it when I get fired for poor performance? What if I'm getting fired from the only job I can reasonably get and now my family will starve?
The definition of violence must involve force or the threat thereof. It cannot just be threatening jobs by creating technological advances. Not just to remain logically coherent, but to prevent the definition from being turned against anything and everything that we happen to disagree with.
There's a lot more nuance that needs to go into discussions of systemic violence.
I'll try to go point by point.
But accusing a CEO of systemic violence because his company produces a product that might lead to job loss is provocative and dangerous.
So the nuance here is that there’s active lobbying happening from AI companies right now. Companies like Google and Meta are influencing policy, but not necessarily toward programs that meaningfully offset displacement. That context matters because policy outcomes don’t exist in a vacuum, they’re shaped by those pressures.
They're creating technologies that cause problems without advocating for any solutions.
Not even just on ethical grounds, but on economic ones too. Because if that many people get displaced, how will they contribute to the economy and afford to buy the products of the companies that used AI to save money? That's another possible hard limit to this whole "infinite growth/profit" model, one we could hit if people aren't thinking about it.
That doesn't mean we shouldn't also criticize the government and policy makers, but all these systems are interconnected due to the way our society is currently structured.
Granted, /u/skybrian has, in their responses, pointed me toward articles that might disprove or alter that opinion, and I have yet to do enough research to officially change it. It very well could change given this new information, but I'm going to respond from where I'm at now. I don't yet know if this "call" is just an interview soundbite or an actual program they're actively lobbying and advocating for.
Is John Deere committing systemic violence by reducing the job market for manual farm workers? Did Apple commit systemic violence when the first Macintosh decimated the job market for manual ledger-keepers? Is a layoff because a company is going under systemic violence? Is it when I get fired for poor performance? What if I'm getting fired from the only job I can reasonably get and now my family will starve?
Haha Actually yes to absolutely every single example you gave, everything you just listed is the exact example of systemic violence, it's just accepted as a normal part of a capitalist system.
Anything systemic that threatens the survival of an individual is systemic violence.
When you come at this issue from the angle that every person deserves a baseline level of means to survive, so that being jobless isn't a death sentence for many people (via losing health insurance, not being able to afford medication, food insecurity, mental health issues, etc.), then anything in a system that threatens an individual's survival is violence against them.
The definition of violence must involve force or the threat thereof. It cannot just be threatening jobs by creating technological advances. Not just to remain logically coherent, but to prevent the definition from being turned against anything and everything that we happen to disagree with.
I mean at that point it's just semantics. Systemic violence is just the accepted term for what we're talking about, which as a concept does exist. I don't define these things, I just use the commonly accepted term for the concept I'm describing.
With that being said, as a concept systemic/structural violence does exist, so I'm not really the one who holds the keys to the definition; my hands are tied. Unfortunately it seems you're two years too late to take up that semantic debate with Johan Galtung, who coined the phrase.
Haha Actually yes to absolutely every single example you gave, everything you just listed is the exact example of systemic violence, it's just accepted as a normal part of a capitalist system.
Then the term is so broad as to be completely meaningless and discussing it is useless, no? In any event I again reject the concept that violence can occur without force.
I don't define these things, I just use the commonly accepted term for the concept I'm describing.
With that being said, as a concept Systemic/Structural violence does exist, so I'm not really the one who holds the keys to the definition, so my hands are tied. Unfortunately it seems that you're two years too late to take up that semantic debate with Johan Galtung.
Respectfully, this is pretty silly. You invoked the term. If I brought up psychoanalysis and you said "that's an absurd idea," it'd be rather ridiculous of me to go "hey, it's a concept that exists, my hands are tied, take it up with Freud, too bad he's dead" as if it was some sort of counterargument.
I mean, you can dislike the practice of psychoanalysis, but if your objection is that you're not "psycho" so the language doesn't make sense, then "talk with Freud about it" is not an unreasonable response, even if it remains a snarky one.
If someone is using an operationalized definition, telling them the name is stupid doesn't really argue against anything other than the name. If you're objecting to the concept more broadly then I think it makes sense to focus less on the semantics.
IMO the term seems well defined; like many academic terms, it's intended to be considered in those spaces and with that underpinning of theory.
Layoffs cause harm. Layoffs happen because shareholders want to make more money, and employees who used to be able to expect a stable living and a pension, or at least a severance package, show up to locked doors with their first unemployment check over a week away. How is this not systemic harm? Because that harm didn't break bones, is it nonviolent?
This is a struggle I have with "violence" vs. "harm" myself, but if a term is defined for the purposes of a conversation (as academic terms are), it's not productive to focus on the definition IMO.
Then the term is so broad as to be completely meaningless and discussing it is useless, no? In any event I again reject the concept that violence can occur without force.
I’m not sure which term you're calling "too broad", "violence" generally, or "structural violence" specifically.
Because structural violence has a defined meaning. It’s not just "anything bad", it refers to systems that produce harm by limiting access to basic needs.
If you’re rejecting the idea that violence can exist without direct force, then you’re rejecting that broader definition entirely. That’s fine, but then the disagreement is about the definition of the word "violence" and not whether the concept is meaningless.
Also, out of curiosity, would you also consider emotional abuse not to be a form of violence just because force isn't used? Would you disagree that the term "domestic violence" encompasses the concept of emotional abuse?
Respectfully, this is pretty silly. You invoked the term. If I brought up psychoanalysis and you said "that's an absurd idea," it'd be rather ridiculous of me to go "hey, it's a concept that exists, my hands are tied, take it up with Freud, too bad he's dead" as if it was some sort of counterargument.
I'm just saying you're getting hung up on the word "violence" and its technical definition, and I have no intention of debating the definition of a word when I'm talking about the concept the term represents.
Because if the word "violence" is the sticking point here, we can set it aside and call it whatever you want and move on. We can invent a completely new term and call it "Structural Komblartamy" from here on out if that helps us move forward.
But we’d still be talking about the same thing: a system where losing a job can mean losing healthcare, where access to survival resources is tied to money, which is tied to employment, and where that creates predictable harm for people who can't realistically opt out of that system when jobs are removed.
If you’re rejecting the idea that violence can exist without direct force, then you’re rejecting that broader definition entirely. That’s fine, but then the disagreement is about the definition of the word "violence" and not whether the concept is meaningless.
I get where you're coming from and this is a reasonable argument, but I disagree. This is going to be a very tangled argument because I recognize you don't want to debate definitions when it's the concept that's important, but the definition is integral to the concept. If you bear with me, I'll try to prove it. After that, I'll discuss the concept of systemic issues more broadly.
The use of the word violence is key to both the meaningfulness of the concept itself and the way you're using it. And I think the way you're using it demonstrates my point, actually. To expand, you wrote
I think reactions to that kind of systemic violence after being ignored in the proper systemic avenues for change shouldn't be considered violence, it's systemic self defense.
Just like with more physical violence, we shouldn't blame the victim for defending themselves, and people shouldn't be accused of "condoning violence" for supporting someone who defended themselves either. And once again, that same concept should be applied to systemic violence as well.
None of this really makes sense if we don't take structural violence to be a form of violence as commonly understood (e.g. with the threat of force and all that). And clearly you do and that's where we disagree, but my point here is that the categorization as violence is inherent to everything that we're discussing. Similarly, there's a reason that Galtung didn't call the concept structural komblartamy in the first place, right? He chose the word violence specifically to evoke the normal definition of the word violence. I mean, to quote his original work explicitly,
Everything now hinges on making a definition of 'violence'.
So... yeah. : )
This matters because of exactly what I disagreed with in the first place. Violence begets violence. Violence justifies violence. Words and ideas have power, a lot of power, and categorizing things ranging from job loss to dying of tuberculosis as violence justifies a violent - i.e. forceful - reaction to them. Which was your point in the first place, no? That a violent reaction would only be self-defense?
Put simply, I disagree. Vehemently.
Poverty is bad. People losing their jobs is bad. People dying of tuberculosis (I mention it again because that was one of Galtung's examples) is bad. Our civilization produces these things, yes. All of them should be worked against. But how we conceive of them - the causes we observe, the labels we apply - defines how we combat them. If we think of them as structural ills, we create new policies to reduce them. If we think of them as violence, we shoot CEOs in the street. Or, in this case, try to burn down their homes.
All of those things should be worked against. But a violent response to them is neither morally justified nor practically effective.
Also, out of curiosity, would you also consider emotional abuse not to be a form of violence just because force isn't used? Would you disagree that the term "domestic violence" encompasses the concept of emotional abuse?
For the former, no, I wouldn't. That doesn't mean emotional abuse isn't bad, obviously, but I think the word abuse characterizes what is happening. For the latter, the US federal definition of domestic violence, which I think is pretty good, does encompass emotional abuse, but - and this is the important part - it must be part of a pattern intended to control or intimidate the victim. Logically it follows that emotional abuse is not in and of itself sufficient to constitute violence; it's the pattern of intimidation and control that constitutes violence. That's a little broader than I would go - I think the use or threat of force is integral - but it's closer to my definition than not.
Violence justifies violence
Probably my biggest issue with the concept of systemic violence. If I’m having systemic violence perpetrated against me, does that mean I can employ self-defense against said system? Against people in said system? Or is the argument going to be that said resistance must be collective and that individual resistance is wrong?
Devil's advocate: "AI" CEOs like Altman are actively taking away jobs, which effectively means people's ability to feed and clothe and house themselves, and any meaningful access to healthcare. Is that not just violence with extra steps? Sure, Sam isn't locking me in a room without food and water. But the effective output of his company accomplishes the same thing.
Any product that does something useful will take away people’s job. That’s almost definitional.
VisiCalc’s release as the first spreadsheet software wiped out an entire field of accounting work. There are still accountants today, of course, but they do something entirely different.
I don't think that definition is even loosely true. I can think of literally hundreds of products that don't take away someone's job (bicycles, food, shoes, toothbrushes, podcasts, video games, my favorite text editor, my favorite IDE, etc etc etc). I guess that definition holds if you assume a zero-sum world where no product can create new opportunities and abilities that didn't exist before?
The difference this time is the order of magnitude of jobs AI boosters are threatening to take away. Literally all white collar work (including the jobs of the AI boosters, mind you) is fair game. If AI can do varied white collar jobs, it can likely also manage itself with vanishingly few humans in the loop. I'm not sure any technical innovation has held a gun to the head of so many careers simultaneously before; even the industrial revolution typically replaced one job at a time with a dedicated machine like the loom, sewing machine, or tractor.
Growing food more efficiently can certainly result in jobs being lost in agriculture. Who do you think did the work before farmers had giant machines to do it? Considering that at one time nearly all people were peasant farmers, it's hard to see how AI could get rid of more jobs than the Industrial Revolution.
A similar argument could be made for basically everything we buy.
Of course, often people end up working different jobs and that's usually considered progress.
Using AI to automate tasks doesn't seem all that different in kind. It might be different because it happens a lot faster. It still seems too soon to say if that's really going to happen, but it has a lot of people worried. What would the new jobs be? Who would be qualified to do them?
I very much agree with this thought. I'm honestly quite excited to see what AI could do to change the way we work. Maybe it won't really have much of an impact, and that's fine.
Maybe it'll have a much larger impact and we'll see white collar work change in a large way.
But yes, this has happened so many times before to humanity, we've had huge shifts in how we work and we always find new things to do.
What would the new jobs be? Who would be qualified to do them?
I already responded to you up higher, and maybe I should wait for a response before responding to something else you said, but I did want to address this.
So this is a very VERY capitalistic view on the situation. Because who says there needs to be new jobs?
Maybe I'm crazy, but I've always believed that capitalism is unsustainable, though it was necessary as a means to an end. True capitalism requires infinite growth and market value as drivers of progress. But there's always going to be a ceiling for that system, and growth is not infinite. The concept of inflation is manageable when you're looking at it over a dozen years, but zoom out decades based on ideal growth and inflation and what are we looking at? Zoom out to 2080: are we looking at $500 gas and a minimum wage set to $350/hr at that point?
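For what it's worth, the compounding math behind that kind of projection is easy to sketch. The 3% rate and $4.00 starting price below are my own illustrative assumptions, not figures anyone here claimed:

```python
# Compound-inflation sketch: how a constant annual rate scales prices
# over decades. The 3% rate and $4.00 starting price are illustrative
# assumptions, not predictions.
def future_price(price_today: float, annual_rate: float, years: int) -> float:
    """Project a price forward under constant compound inflation."""
    return price_today * (1 + annual_rate) ** years

years = 2080 - 2025  # 55 years out
print(f"Growth factor at 3% over {years} years: {1.03 ** years:.2f}x")  # about 5x
print(f"$4.00 gas in 2080: ${future_price(4.00, 0.03, years):.2f}")
```

At a steady 3% you get roughly a 5x increase by 2080; reaching something like $500 gas would take a sustained rate closer to 9%, which shows how sensitive these long-horizon projections are to the assumed rate.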
Capitalism was a great motivator that pushed us to a point of technological and societal progress where every human on the planet could be afforded a decent living without overt struggle.
We're at that point. And AI can absolutely help with that, but AI and Capitalism is a mixture that won't end well for the average person.
Way back before it was called AI, when I just called it automation, I saw the writing on the wall the moment they added self-checkouts to grocery stores. I was advocating for taxing corporations that use automation in ways that displace human workers, and using that tax revenue to fund a universal basic income for those who get displaced, as well as training programs and education for trades that are needed. A UBI that a displaced person could live on without going hungry or losing their home, but that would also motivate them to find work or training in other needed sectors in order to make more than the UBI paid.
The idea of the tax is that it also leveled the playing field a bit in a way that would offset the financial benefits of replacing humans and instead turn it into efficiency benefits instead. Advocating for a surgical use of automation instead of mass layoffs on a whim.
And eventually maybe people being displaced could have focused on pursuing education or personal enlightenment. But unfortunately under capitalism the only ones who get to do that are the CEOs, Shareholders, political elite, and billionaires.
Unfortunately none of that happened and here we are.
I’m in favor of UBI. But regarding new jobs, the question is whether there is more work that could be done if there were people willing to pay for it and systems in place to make that work viable.
When people aren’t worrying about AI, sometimes they worry about demographics - more retirees to take care of and fewer workers to support them. Maybe that’s what productivity improvements get “spent” on? In some countries, anyway.
When people aren’t worrying about AI, sometimes they worry about demographics - more retirees to take care of and fewer workers to support them. Maybe that’s what productivity improvements get “spent” on? In some countries, anyway.
I’ve come to think these days that the problem is not the lack of workers, but with the ideas driving society itself.
In my grandmother’s final years she had a number of nurses. Many of them were Filipino, because they were “naturally caring”. Uncomfortable racialism aside, that raises the question of what a caring society looks like, and why elder care might be a problem that needs more than paid labor to address. If we did have a labor-independent society, maybe family would be free to stop working their jobs and take care of their elders. Maybe the volunteers who are currently helping elders would be free to take more direct action to help out. I don’t think we would have to worry as much about elder care in terms of paid labor.
The trouble is that the alternative to paid labor is unpaid labor. There are situations when volunteers can play a role, but it would be unethical to expect nurses to go unpaid.
Traditionally, people relied on unpaid labor by family members. And I'm living that because my mother came to stay with us for the winter and I expect to do more of that. It works for us.
Except it's not going to work for my wife and me, because we don't have kids. It would be both unworkable and unethical to attempt to rely on volunteers when we get old.
I do hope we will be able to find caring people to help us, but I also expect to pay them.
bicycles, food, shoes, toothbrushes, podcasts, video games, my favorite text editor, my favorite IDE
Each of those is an alternative to a different profession: stagecoaches, farmers (via efficiency gains), dentists, 🎵TV killed the radio stars, board games, punch card maintainers, programmers who write assembly.
Debatable. Are toothbrushes really an alternative to dentistry? I suppose in the medieval sense of dentists as "people who remove rotten teeth", but if anything I would think modern dental care actually brings more business to dentists through crowns, fillings, and cleanings?
I think the idea that innovations inevitably displace existing industry is a tad reductionist, is all.
You still need to see a dentist regularly and may need them regardless of how well you take care of your teeth, because shit happens. That's a bad example IMO.
You still need to see a dentist regularly and may need them regardless of how well you take care of your teeth, because shit happens.
I used to get a lot of fillings due to poor oral hygiene, maybe one or two a year. This meant longer sessions at the dentist or sometimes a separate appointment to perform the extra work. Then I got gingivitis, which scared me into consistently brushing and flossing ever since. Haven't needed a new filling in almost 20 years. My dentist has even suggested that I don't actually need to go to them twice a year since they usually don't find much. Although I still go to the dentist twice a year, fillings and other dental work would be a significant addition to the typical maintenance work.
Toothbrushes obviously aren't an alternative to ALL dentistry, but I'm sure that the fact that toothbrushes are cheap and widely available means less work for dentists in aggregate. If we didn't have toothbrushes we'd need more dentists. That's just in a vacuum though. I think the number of dentists in reality is more influenced by economics and culture than any measure of overall dental health in the population.
Maybe, but toothbrushes are older than modern dentistry (although ancient dentistry and ancient tooth-care tools like chew sticks are both thousands of years old). I don't think it's a good example of an invention that took jobs, given that very long and complex global history. And even if you're only thinking about Western tooth care, you're looking at over 400 years of toothbrushes.
I guess that definition holds if you assume a zero-sum world where no product can create new opportunities and abilities that didn't exist before?
Who’s to say that AI won’t create new jobs and opportunities? Excel turned accountants from people who manually did calculations in a book to people who wrangle excel.
To be clear, I don’t think there’ll be a lot of ChatGPT wrangling jobs, but that’s because I don’t think it’s particularly useful outside of coding assurance, and won’t disrupt many jobs in the long term.
That's a good point. Right now the AI boosters aren't doing a good job of showing where these "gaps" in AI ability might actually exist, mostly because they're optimists who refuse to acknowledge the shortcomings of AI. IMO LLMs aren't likely to do much autonomous work, but I'm definitely not excited at the prospect of a gig where I babysit and fact-check a dozen or more LLMs who do my old job worse than me.
If LLMs are to disrupt industry in a way that reflects the amount of money being invested, you won't have to fact check them.
They're not very useful in their current limited, naive implementation states. They're useful in custom built agentic workflows where they're able to fact check themselves though.
My personal experience is that they have gotten better, but improvements have become incremental. If I had to guess, I think the technology will land somewhere between the confident bullshit generator we have now, and the AGI utopian future all of the leaders of these companies are promising.
That means the new jobs that best adapt to this technology will go to people who are skilled at building these agentic workflows: curating proprietary data and tying it into RAG pipelines, and having multiple models run verification steps and build tests on the fly to overcome their bullshit tendencies.
I imagine that a team inside a company that can find problems throughout the business, and solve those problems much more rapidly than a traditional internal dev team could using low code AI agent platforms would be really valuable for a lot of businesses.
The actual devs will continue doing actual dev stuff and won't have to worry about new project requests to build some crud app for some random HR team or whatever anymore.
Yeah, I've been using Google Antigravity for a few projects, and I'm impressed. It's significantly better than trying to code by just having a conversation with a chatbot and copy pasting in and out. At its core, it's not that much different from a manual chatbot approach, but the ease of use and automation of gruntwork feels significant to me. It's not a magic bullet. I still have to iterate on the roadmap, then iterate on the design docs for each element on the roadmap, then iterate on the implementation plans for each element on the design docs, then iterate on corrections for shifting goals or things that went wrong. However, you can get to some good results on platforms/languages you don't know just by knowing fundamentals of programming and software engineering. I made a Chrome extension recently that seems to work quite well, despite the fact that I've never programmed in JS. I can't speak for professional software engineering, but for people like me who deal more in data analysis, one-offs, and hobby projects, it's been great.
Who’s to say that AI won’t create new jobs and opportunities?
The CEOs are pitching this as a way to replace labor.
I don’t think it’s particularly useful outside of coding assurance, and won’t disrupt many jobs in the long term.
Yeah, maybe in the middle term. I can't speak to what actually happens in 30-50 years (the timeline for progress that current CEOs want to claim is "only a few years away").
I guess that depends on whether we can find more uses for code or other AI outputs. It seems like Excel didn't decrease the number of accountants much, just increased the amount of accounting we're doing. Maybe we don't necessarily decrease the number of programmers so much as increase the amount of programming/coding. I'm seeing some of this happen in research. People are applying coding approaches to a lot more situations than before since the barrier seems to be lower. Who knows how the balance of jobs will shake out a few years from now?
I'm reminded of the post from a bit ago with the perspective that AI models won't let us work less, just do more with less time. It seems to me that this is a pattern that has held up through history. As long as we feel like there are gains to be made, we'll think of jobs for people to do, until AI and robotics clearly outperform us in every aspect. Despite the hype from people like Altman, I don't think we're particularly close to that point so far, although people may need to be adaptable to stay employed, depending on how industries shift.
I firmly believe there are plenty more uses for code, so I do think that is the potential future. Many open source projects' development is limited by contributors, and often there are people who want to contribute but can't code and don't have a ton of money to fling around, so there's very little they can actually do. Not saying those people should contribute by vibe coding, rather I'm saying that I frequently identify scenarios where demand for coding exceeds supply. And there are tons of niche cases where the average non-coding person could probably come up with an idea for software that would be useful to some, but they can't make it themselves and it's not worth anyone else's time to do it.
Now I don't know that all those potential scenarios will come to fruition even with AI, because there are some limits to vibe coding, and there will probably still be some limits to what a developer will spend their time on if a non-dev can't vibe code it themselves. If it's something only one person wants, and it takes 10 minutes and they pay $30 for it or something, that might be one case, but if only one person wants it and it takes 10 hours, that's an entirely different circumstance.
40-60 years ago, we'd offer retraining programs or early retirement packages to make sure people stay on their feet. They did this because they learned from the 19th century what happens when you don't.
Any product that does something useful will take away people’s jobs. That’s almost definitional.
Part of what makes systemic violence systemic is that it's not always that direct.
In this case it's not just the fact that the useful product takes away jobs, it's also the fact that there is no systemic change to offset how that negatively impacts people.
In Sam Altman's case, if he also advocated for UBI programs or assistance for people who get displaced by his product, he'd be far less problematic.
Or if the society we lived in had better programs for addressing that kind of displacement, the product would be seen as less systemically violent.
But apparently that's "socialism" and we can't have any of that, can we? We prefer debt, hunger, and homelessness in this country!
And fyi this is coming from someone who advocates for AI as a technology but criticizes the societal structures that make AI dangerous. Every problem people have with AI as a technology can be traced back to systemic failures in our society. The problem isn't the technology, it's the system.
Sometimes he does: OpenAI calls for robot taxes, a public wealth fund, and a 4-day workweek to tackle AI disruption.
OpenAI CEO Sam Altman has also expressed support for Universal Basic Income, a proposal for recurring cash payments to all adults regardless of wealth or employment.
In May 2024, Altman suggested a new version he dubbed Universal Basic Compute, where people receive a share of AI computing power rather than cash, which they could use, sell, or donate.
I don’t think it’s going to satisfy anyone, though, because it’s just talk. Converting that into something real is up to governments, not the AI companies.
There is a very well-funded foundation, though, and they could do… something?
Nonetheless, OpenAI did finally strike a contortive restructuring deal last October. Essentially, the for-profit arm became what is known as a public benefit corporation (PBC), called the OpenAI Group. The original nonprofit became the OpenAI Foundation, which has a 26 percent stake currently worth $180 billion in the PBC, plus a sliver of exclusive legal control over certain major decisions.
…
The resulting stake of the OpenAI Foundation is big enough to instantly make it one of the wealthiest charities in the country, or in OpenAI’s words, the “best-equipped nonprofit the world has ever seen.” On paper, at least, the foundation is now significantly richer than the entire country of Luxembourg. Even the Gates Foundation has only $77.6 billion in assets, less than half of what the OpenAI Foundation can draw from, though it’s important to note that most of the wealth of the OpenAI Foundation is locked in fairly illiquid shares within the still private company, which limits how quickly any money can be given away.
You know, that's actually really interesting. I'm going to read those and do some more research into that. Thank you.
I want to see if they end up putting their money where their mouth is.
Sometimes there's a middle ground, like virtual adding machine software from 1993 that mimics the electronic adding machines to which accountants were accustomed—machines which were themselves heavily inspired by earlier mechanical adding machines.
I am reminded... (The golem speaks with capital letters)
"Do you understand what I'm saying?" shouted Moist. "You can't just go around killing people!"
"Why Not? You Do." The golem lowered his arm.
"What?" snapped Moist. "I do not! Who told you that?"
"I Worked It Out. You Have Killed Two Point Three Three Eight People," said the golem calmly.
"I have never laid a finger on anyone in my life, Mr Pump. I may be–– all the things you know I am, but I am not a killer! I have never so much as drawn a sword!"
"No, You Have Not. But You Have Stolen, Embezzled, Defrauded And Swindled Without Discrimination, Mr Lipvig. You Have Ruined Businesses And Destroyed Jobs. When Banks Fail, It Is Seldom Bankers Who Starve. Your Actions Have Taken Money From Those Who Had Little Enough To Begin With. In A Myriad Small Ways You Have Hastened The Deaths Of Many. You Do Not Know Them. You Did Not See Them Bleed. But You Snatched Bread From Their Mouths And Tore Clothes From Their Backs. For Sport, Mr Lipvig. For Sport. For The Joy Of The Game."
Killing less than 2.338 people gets the death penalty in some states. And yet we're so much more accommodating of those who kill indirectly if they're making other people a lot of money in the process.
Hell, now you can bet on those deaths on Polymarket at the same time. It's fine!
Is that not just violence with extra steps?
Of course not. (I recognize that you're positing a devil's advocate argument.) By this logic, if you and I are competing for a job and I get it, that would also be violence. It would be worse, actually, since Altman is much farther removed from you and I'm competing with you directly.
Me getting a job over someone else doesn't mean that other person can't get another job.
Altman's actions can definitely be argued to directly affect you, regardless of whether you're an AI engineer, work in a different sector entirely, or are simply located next to a data center.
Sadly, what's actually happening is that people are instead being convinced that it's the immigrants, wind turbines, trans kids playing sports, bike lanes, etc. causing their problems and not unchecked neoliberalism, which is then used to push for even more unregulated capitalism.
Oh, I'm definitely not advocating for violence either. But I can see why there has been so much anger.
A socialist revival in politics would be the good outcome...
Sadly, yes. I wasn't making Bane posts on other parts of the web about the warehouse fires because I approve. It's because if you read the history of the Gilded Age some 100-120 years ago, we're going through the same steps all over again.
A cornered rat strikes back, and we don't have a Skynet robot army to stop that as of now.
What frustrates me is that replacing people with robots at said jobs should be a good thing. We want to strive for a world where we don't have to work on things we don't need, so we can work on things that matter to us. That's the life-long dream.
In a functioning society, robots doing all the work would mean people being more free, not less.
But in a capitalist society, robots replacing you just pushes you into worse and worse jobs to further help the capitalist class hoard resources, leaving you with even less in the end.
So as it stands, people will just attack the robots and their owners because they know there are no social safety nets otherwise. And I don't blame them, their government has failed them.
I think AI is overall a bad thing. The sheer amount of LLM-generated slop littering Reddit, Instagram, Facebook, YouTube and TikTok these days shows just how spot-on the Dead Internet Theory is.
One example of how prevalent AI has become is when you look at pub advertising on social media. Almost every pub I follow uses AI to generate images in their posts, and while I don't blame them for doing this rather than hire actual graphic artists and professional photographers (labour is expensive and pubs are struggling), it leads to glaring inaccuracies like seeing a post advertising televised darts matches with completely incorrect numbers, or even random letters along the rim of the dartboard. Or... AI-depictions of pro players wearing jerseys that have complete gibberish sponsor logos.
And don't get me started on the blatant Ghiblified cartoons and piss-filters I see on a lot of AI images.
But in a capitalist society, robots replacing you just pushes you into worse and worse jobs to further help the capitalist class hoard resources, leaving you with even less in the end.
Definitely my experience, although being pushed into a worse job is a hopeful outcome. I was laid off from a commercial reporting job last year and struggled for months to even find anything in accountancy and finance. AI is a very big reason why the job market is in such shambles. If entry level programming jobs are practically nonexistent these days thanks to Anthropic and Claude Code, imagine how it is for any office-based job that doesn't require you to write code...
The most code I've ever written in a professional setting has been the occasional VBA script to automate a tedious task. Otherwise, much of my work involves basic data entry in an accountancy system and using formulas and pivot tables in Excel. Claude could probably do about 90% of my previous job.
I have genuinely questioned whether I should continue studying ACCA or pivot towards a different field entirely. But this would be the second time I've gone into higher/further education and ended up with my academic qualifications becoming worthless. And I don't know what I could even do that is AI-proof. I didn't spend years and lots of money studying degrees and professional qualifications to work in a warehouse or cleaning offices for minimum wage...
In a functioning society, robots doing all the work would mean people being more free, not less.
We unfortunately do not live in such a society. We need money in order to acquire food, drink, shelter and everything else we need to survive. Cut off the source of money and people end up starving.
UBI is often touted as a solution but I think it's a socialist pipe dream at best. Any nation that implements it will either go bankrupt or pay a pittance that doesn't help anyone. The big question is how will anyone even fund it? We can't even make billionaires and big businesses pay tax at the moment - what the hell makes people think they'll suddenly cough up when unemployment soars?
I think the next few years are going to get incredibly ugly. Either the AI bubble bursts and crashes global stock markets, or entire industries of people get replaced and we end up with mass civil unrest, which could end up being suppressed brutally, especially if AI ends up being implemented in autonomous weaponry.
AI is being used as another tool to squeeze the working class for everything they can get out of us. It itself isn't the problem (at least not intrinsically). It's just another cog in the orphan crushing machine, albeit a particularly large and annoying one.
Technology in general has always been harnessed by the powerful to subjugate the masses. It's getting terrifyingly efficient at it now, though. They can know where you live, when you sleep, what you do, who you talk to, what you eat, your medical records, your purchasing habits, listen in on your conversations, and use any of that information however they want to.
regarding your Dead Internet Theory thought, it has been very sad to see so many communities "polluted" by LLM output text. Used to be, writing coherent longform text was enough of a barrier that most spam was obvious. But now, for example, an Amazon review farm can use LLMs to create dozens of pretty credible, detailed 4 and 5 star reviews. You'll only know when the product turns out to be dogshit.
That's to say nothing of the LLM spam on forums these days, I get pretty annoyed when I realize I'm reading a few bullet points bloated out into multiple paragraphs. Blog posts, too.
It's very sad to see this tragedy of the commons playing out in real time. It honestly makes me want to totally drop the internet some days; I'm here to meet with minds, not lap up slop.
To starve and to be homeless and all that as we cut public assistance. Even job retraining resources get cut drastically. So losing the job and the field makes it feel hopeless, not like there's a great big beautiful hopeful figure out there.
Sam's reaction blog post to this is pretty interesting: https://blog.samaltman.com/2279512
A paragraph explaining how his family is the most important thing in his life.
Then 10x that amount of text for... yet another AI sales pitch? And, in classic tech bro fashion, a Lord of the Rings reference that, much like Peter Thiel's Palantir, feels like either a malicious interpretation of the source text or the ravings of someone who didn't actually read the books:
“Once you see AGI you can’t unsee it.” It has a real "ring of power” dynamic to it, and makes people do crazy things. I don’t mean that AGI is the ring itself, but instead the totalizing philosophy of “being the one to control AGI”.
The only solution I can come up with is to orient towards sharing the technology with people broadly, and for no one to have the ring.
First: is the AGI in the room with us right now, Sam? How did we go from "attacks on my home make me sad" to "AGI AGI AGI" this fast?
Second: if controlling AGI is the One Ring (if Sam had read the books, he might have learned that there are quite a few Rings of Power, but I digress), is Sam Sauron in his own analogy? And we're supposed to trust Sauron's selfless gift to all of the free peoples of the earth? What could possibly go wrong!
Maybe the scariest thing about this analogy is that it implies that Sam thinks he's as personable as Sauron was before he revealed his true nature. Quite the ego.
On one hand, obviously nobody should be throwing molotovs at anyone else's house. That is a bad thing. On the other hand, a guy who can't separate the safety of his family from a sales pitch and one of the worst LotR analogies I've ever read... is not exactly easy to empathize with.
Even ignoring his sales pitch in there, it's impossible to take him seriously because he offers nothing but platitudes and nothing of substance or specificity. He won't actually stand up for anything. He acts like he understands the frustration people have with him, with the company he works for, with the industry he works within, but he can't actually detail or specify the root causes of the problems or the solutions.
He mentions how democracy needs to stay a democracy and companies shouldn't capture the power, without mentioning or addressing how it's widely believed to already be corrupted, that companies have all the power and democracy is non-existent. What is Sam Altman doing to help? I mean aside from the pitch for OpenAI here, it was also supposed to be a pitch for Sam Altman, the guy who is on 'our' side, that he understands the plight of others who might molotov cocktail his house, but the most he can offer is a line or two about having a family. He needs to only look around at all the people getting crushed by the system to see that having a family doesn't protect anyone, so he might want to find a better pitch than that.
Then 10x that amount of text for... yet another AI sales pitch?
The pump is nearing its end and the dump is coming. The business is still lighting money on fire, and there have been noises of an IPO. Going public is the classic way to find a bunch of bag-holders and jump into a life boat before the ship goes down. (And every media opportunity is a chance to keep the hype alive.)
Meanwhile, datacenters are being put on hold or cancelled due to local pushback, energy costs from the invasion of Iran, hardware shortages, transformer shortages, etc.
You've got growth stalling, ludicrous amounts of debt and no realistic way to ever pay it back. Except a trillion dollar valuation and maybe enough suckers to buy shares. From there, it's either cut-and-run time or we'll end up with another idiot meme stock.
I'm also not surprised the owner of a machine people use to avoid reading, writing and thinking doesn't grasp the basics of a reasonably challenging piece of literature.
Really, I have to wonder whether he was always like that - or whether constantly talking with an overly agreeable language model makes someone self-centered in that way.
Not joking about that either, delusions are well known by now but that's just an extreme that might be easier to identify.
What they're describing is more of a Yes Man feeding your narcissism all the time. Psychosis would be a much higher level of disconnect from reality.
Personally, I suspect AI helped him write that post in the first place, so it doesn't surprise me that it sounds incredibly corporate and "LinkedIn," cuz that's what it would be matching.
They're calling the guy who set the "original" fire "Paper Mario" and putting him in memes alongside Luigi.
Also saw someone say "so when I ask my boss for a raise should I reference that he made $2 more an hour than me?"
Idk that it is a sign of revolution or a sign of deteriorating social norms and increased violence, or both. The general (or perceived) increase in violence and aggression in customer service, Jan 6, a bunch of things point to us being less and less civil.
The link to Jan 6th is interesting. Intentionally or otherwise, Trump has somewhat normalized (and even legitimized and pardoned!) physical violence against the ruling class. One wonders if that could ultimately be his downfall.
I firmly believe the increase in violence in schools (fights and such) is tied to the normalization of both violent language and threats from the Trump administration over the past 10 years
Paper Mario is a great one haha.
I have no idea either, but seeing more and more of this desperation makes me wonder how much of it will keep happening until something changes.
Mario and Luigi are both characters from the same franchise, there are some games in that franchise titled "Paper Mario", and the warehouse that was set on fire was a paper goods facility.
Also, people have been "joking" about wanting a "Player Two" or a "Mario" alongside Luigi online for a while. Some less wanting and more expecting it to be inevitable. So when someone else reads as a symbol of class consciousness while allegedly committing a crime, the door was already open. The paper part is just convenient.
The other piece of the puzzle that folks haven't mentioned is that people are connecting this guy to Luigi Mangione, the man accused of assassinating a health insurance CEO.
Gotcha. There are multiple Paper Mario games which play with dancing between 2 and 3 dimensions and the idea of folding things up into new shapes. One of them was the spiritual successor to Super Mario RPG IMO, but they've all been fun.
There's no other connection though, it's just a reference.
I'm no expert, but I feel like there needs to be some amount of organization in order to be a revolution. Either I'm not cool enough to know who those people are (very possible), or they don't exist (also very possible).
“I have to admit I felt a little weird as I prepared to toss this flaming incendiary device through [Altman’s] front window, but the recipe explicitly stated that this was an essential step to get that creamy, velvety risotto texture. I guess I didn’t know any better. I mean, I’ve never made risotto before.”
It's getting rough out there, y'all.
There's a big gaming-adjacent forum I used to frequent, ResetEra, which always billed itself as a cut above others in terms of being civil and progressive. It was actually formed when the mods and most users of the website NeoGAF jumped ship in protest after the controversial owner got #MeToo'd.
They used to handle the AI discourse about as reasonably as Tildes does, but a couple years ago some extremely anti-tech faction of the mod team successfully lobbied for policies purporting to sharply limit AI discussion, which they promptly started abusing to smother informative or newsworthy threads while letting low-effort ragebait on the topic proliferate. As somebody with very mixed feelings on the technology who nevertheless enjoys talking about its evolution and implications, this was a very frustrating development.
Recently there were multiple threads about violence directed at people associated with the industry, including the now-multiple attacks on Altman's home and gunfire shot at the house of an Indiana politician who backed data center construction. Both were filled with comments glorifying the violence and openly hoping for more. Despite being flagrant violations of site rules, the posts stayed open for days and reports about them were ignored. When I finally called out the moderation failure in the site's official meta thread and asked why they were tolerating people advocating murder, they permabanned me instead.
I don't consider it a great loss, since I'd been spending minimal time there lately due to the mods flouting their own rules like that more and more often. Revisiting any thread more than a year or two old invariably shows a graveyard of normal-seeming commenters whose accounts have since been banned (or self-deleted in protest) for crossing the mods one way or another. But it says something that one of the last big progressive forums with a solidly 30+ progressive audience is being infected by this kind of bloodthirsty rhetoric, and that the people in charge are less inclined to punish people braying for murder than they are the people accusing them of tolerating calls for violence. You kind of expect to see that on social media these days, but part of me hoped that older-school places with alleged standards would hold the line better.
I wonder if the internet is making it harder to organize against those in power because it provides a weak local maximum to our sense of belonging. If these people were talking about AI with their local unions then they’d actually have people to march with.
I think it's only a matter of time until another major CEO gets assassinated.
Anti-capitalist sentiment has been rapidly growing over the years. Just a few days ago, a disgruntled worker in California who compared himself to Luigi Mangione set fire to a warehouse, destroying roughly $500 million worth of goods.
Leading a company in an industry that strives to replace workers with robots and drive mass unemployment is a sure-fire way to be deeply unpopular, especially when their antics have skyrocketed RAM prices and done serious harm to the consumer electronics market.
While I don’t advocate for violence, I think America is overdue for a socialist revival. Probably the rest of the world too. The world is not in a good place right now.
IMO that's the opposite of advocating for violence. If people aren't given a legitimate political avenue to meet their needs, they will just act outside the political system. A socialist revival would prevent the kind of desperation that leads to crazy acts like this.
Yeah I think we need to redefine what "violence" is in these dynamics and define a new term to replace it: "systemic self defense."
Because "violence" is not always of the simplistic definition "human hit other human," it's systemic violence like threatening jobs and livelihoods of people who will now have to struggle to survive because of those decisions. It's creating a healthcare industry that thrives when it denies lifesaving care of it's members.
I think reactions to that kind of systemic violence, after being ignored in the proper systemic avenues for change, shouldn't be considered violence; they're systemic self defense.
Just like with more physical violence, we shouldn't blame the victim for defending themselves, and people shouldn't be accused of "condoning violence" for supporting someone who defended themselves either. And once again, that same concept should be applied to systemic violence as well.
I completely disagree. I could get on board with the notion that a government policy threatening jobs and livelihoods constituted systemic violence, because the government inherently represents a threat of force. But accusing a CEO of systemic violence because his company produces a product that might lead to job loss is provocative and dangerous. All the more so because of the subsequent argument that people shouldn't be blamed for defending themselves against such "violence."
Is John Deere committing systemic violence by reducing the job market for manual farm workers? Did Apple commit systemic violence when the first Macintosh decimated the job market for manual ledger-keepers? Is a layoff because a company is going under systemic violence? Is it when I get fired for poor performance? What if I'm getting fired from the only job I can reasonably get and now my family will starve?
The definition of violence must involve force or the threat thereof. It cannot just be threatening jobs by creating technological advances. Not just to remain logically coherent, but to prevent the definition from being turned against anything and everything that we happen to disagree with.
There's a lot more nuance that needs to go into discussions of systemic violence.
I'll try to go point by point.
So the nuance here is that there’s active lobbying happening from AI companies right now. Companies like Google and Meta are influencing policy, but not necessarily toward programs that meaningfully offset displacement. That context matters because policy outcomes don’t exist in a vacuum, they’re shaped by those pressures.
They're creating technologies that cause problems without advocating for any solutions.
Not even just on an ethical ground, but even an economic one too. Because if that many people get displaced then how will those people add to the economy and afford to buy the products of the companies that used AI to save money? That's another possible hard limit to this entire "infinite growth/profit" that can be reached if people aren't thinking about it.
That doesn't mean we shouldn't also criticize the government and policy makers, but all these systems are interconnected due to the way our society is currently structured.
Granted, /u/skybrian in their responses has pointed me towards articles that might disprove or alter that opinion, and I have yet to do enough research into them to officially change my opinion. It very well could change given this new information, but I'm going to respond from where I'm at now. I don't yet know if this "call" is just an interview sound bite or if it's an actual program they're actively lobbying and advocating for.
Haha, actually yes to absolutely every single example you gave. Everything you just listed is an exact example of systemic violence; it's just accepted as a normal part of a capitalist system.
Anything systemic that threatens the survival of an individual is systemic violence.
When you come at this issue from the angle that every person deserves a baseline level of means to survive, so that the idea of being jobless isn't a death sentence to many people (via losing health insurance, not being able to afford medication, food insecurity, mental health issues, etc.), then anything in a system that threatens an individual's survival is violence against them.
I mean at that point it's just semantics. Systemic violence is just the accepted term for what we're talking about, which as a concept does exist. I don't define these things, I just use the commonly accepted term for the concept I'm describing.
With that being said, systemic/structural violence does exist as a concept, so I'm not really the one who holds the keys to the definition; my hands are tied. Unfortunately it seems that you're two years too late to take up that semantic debate with Johan Galtung, who coined the phrase.
Then the term is so broad as to be completely meaningless and discussing it is useless, no? In any event I again reject the concept that violence can occur without force.
Respectfully, this is pretty silly. You invoked the term. If I brought up psychoanalysis and you said "that's an absurd idea," it'd be rather ridiculous of me to go "hey, it's a concept that exists, my hands are tied, take it up with Freud, too bad he's dead" as if it was some sort of counterargument.
I mean, you can dislike the practice of psychoanalysis, but if your objection is that you're not psycho so the language doesn't make sense, then "talk with Freud about it" is not an unreasonable response, even if it remains a snarky one.
If someone is using an operationalized definition, telling them the name is stupid doesn't really argue against anything other than the name. If you're objecting to the concept more broadly then I think it makes sense to focus less on the semantics.
IMO the term seems well defined; like many academic terms, it's intended to be considered in those spaces and with that underpinning of theory.
Layoffs cause harm - layoffs because shareholders want to make more money and so employees who used to be able to expect a stable living and a pension or at least a severance package show up to the doors being locked and unemployment being over a week away.... How is this not systemic harm? Because that harm didn't break bones is it non violent?
This is a struggle I have with the term violence vs harm myself but if defined for the purposes of a conversation (such as with an academic term), it's not productive to focus on the definition IMO.
I’m not sure which term you're calling "too broad": "violence" generally, or "structural violence" specifically.
Because structural violence has a defined meaning. It’s not just "anything bad", it refers to systems that produce harm by limiting access to basic needs.
If you’re rejecting the idea that violence can exist without direct force, then you’re rejecting that broader definition entirely. That’s fine, but then the disagreement is about the definition of the word "violence" and not whether the concept is meaningless.
Also, out of curiosity, would you also consider emotional abuse not to be a form of violence just because force isn't used? Would you disagree that the term "domestic violence" encompasses the concept of emotional abuse?
I'm just saying you’re getting hung up on the word "violence" and its technical definition, and I don't have any intentions to debate the definition of a word when I'm talking about the concept the term represents.
Because if the word "violence" is the sticking point here, we can set it aside and call it whatever you want and move on. We can invent a completely new term and call it "Structural Komblartamy" from here on out if that helps us move forward.
But we’d still be talking about the same thing: a system where losing a job can mean losing healthcare, where access to survival resources is tied to money, which is tied to employment, and where that creates predictable harm for people who can't realistically opt out of that system when jobs are removed.
I get where you're coming from and this is a reasonable argument, but I disagree. This is going to be a very tangled argument because I recognize you don't want to debate definitions when it's the concept that's important, but the definition is integral to the concept. If you bear with me, I'll try to prove it. After that, I'll discuss the concept of systemic issues more broadly.
The use of the word violence is key to both the meaningfulness of the concept itself and the way you're using it. And I think the way you're using it demonstrates my point, actually. To expand, you wrote
None of this really makes sense if we don't take structural violence to be a form of violence as commonly understood (e.g. with the threat of force and all that). And clearly you do and that's where we disagree, but my point here is that the categorization as violence is inherent to everything that we're discussing. Similarly, there's a reason that Galtung didn't call the concept structural komblartamy in the first place, right? He chose the word violence specifically to evoke the normal definition of the word violence. I mean, to quote his original work explicitly,
So... yeah. : )
This matters because of exactly what I disagreed with in the first place. Violence begets violence. Violence justifies violence. Words and ideas have power, a lot of power, and categorizing things ranging from job loss to dying of tuberculosis as violence justifies a violent - i.e. forceful - reaction to them. Which was your point in the first place, no? That a violent reaction would only be self-defense?
Put simply, I disagree. Vehemently.
Poverty is bad. People losing their jobs is bad. People dying of tuberculosis (I mention it again because that was one of Galtung's examples) is bad. Our civilization produces these things, yes. All of them should be worked against. But how we conceive of them - the causes we observe, the labels we apply - defines how we combat them. If we think of them as structural ills, we create new policies to reduce them. If we think of them as violence, we shoot CEOs in the street. Or, in this case, try to burn down their homes.
All of those things should be worked against. But a violent response to them is neither morally justified nor practically effective.
For the former, no, I wouldn't. That doesn't mean emotional abuse isn't bad, obviously, but I think the word abuse characterizes what is happening. For the latter, the US federal definition of domestic violence, which I think is pretty good, does encompass emotional abuse, but - and this is the important part - it must be part of a pattern intended to control or intimidate the victim. Logically it follows that emotional abuse is not in and of itself sufficient to constitute violence; it's the pattern of intimidation and control that constitutes violence. That's a little broader than I would go - I think the use or threat of force is integral - but it's closer to my definition than not.
Probably my biggest issue with the concept of systemic violence. If I’m having systemic violence perpetrated against me, does that mean I can employ self-defense against said system? Against people in said system? Or is the argument going to be that said resistance must be collective and that individual resistance is wrong?
Devil's advocate: "AI" CEOs like Altman are actively taking away jobs, which effectively means people's ability to feed and clothe and house themselves, and any meaningful access to healthcare. Is that not just violence with extra steps? Sure, Sam isn't locking me in a room without food and water. But the effective output of his company accomplishes the same thing.
Any product that does something useful will take away people’s jobs. That’s almost definitional.
VisiCalc’s release as the first spreadsheet software all but destroyed an entire field of accounting work. There are still accountants today, of course, but they do something entirely different.
I don't think that definition is even loosely true. I can think of literally hundreds of products that don't take away someone's job (bicycles, food, shoes, toothbrushes, podcasts, video games, my favorite text editor, my favorite IDE, etc etc etc). I guess that definition holds if you assume a zero-sum world where no product can create new opportunities and abilities that didn't exist before?
The difference this time is the order of magnitude of jobs AI boosters are threatening to take away. Literally all white collar work (including the jobs of the AI boosters, mind you) is fair game. If AI can do varied white collar jobs, it can likely also manage itself with vanishingly few humans in the loop. I'm not sure any technical innovation has threatened a gun to the head of so many careers simultaneously before; even the industrial revolution typically replaced one job at a time with a dedicated machine like the loom, sewing machine, or the tractor.
Growing food more efficiently can certainly result in jobs being lost in agriculture. Who do you think did the work before farmers had giant machines to do it? Considering that at one time nearly all people were peasant farmers, it's hard to see how AI could get rid of more jobs than the Industrial Revolution.
A similar argument could be made for basically everything we buy.
Of course, often people end up working different jobs and that's usually considered progress.
Using AI to automate tasks doesn't seem all that different in kind. It might be different because it happens a lot faster. It still seems too soon to say if that's really going to happen, but it has a lot of people worried. What would the new jobs be? Who would be qualified to do them?
I very much agree with this thought. I'm honestly quite excited to see what AI could do to change the way we work. Maybe it won't really have much of an impact and that's not a big deal; that's fine.
Maybe it'll have a much larger impact and we'll see white collar work change in a large way.
But yes, this has happened so many times before to humanity, we've had huge shifts in how we work and we always find new things to do.
I already responded to you up higher, and maybe I should wait for a response before responding to something else you said, but I did want to address this.
So this is a very VERY capitalistic view on the situation. Because who says there needs to be new jobs?
Maybe I'm crazy, but I've always believed that capitalism is unsustainable, but that it was necessary as a means to an end. Because true capitalism requires infinite growth and market value as drivers of progress. But there's always going to be a ceiling for that system, and growth is not infinite. The concept of inflation is manageable when you're looking at it over the span of a dozen years, but zoom out decades based on ideal growth and inflation and what are we looking at? Zoom out to 2080: are we looking at $500 gas and minimum wage being set to $350/hr at that point?
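To put rough numbers on that worry (purely illustrative; the 3% rate and the $4-to-$500 gas figures are assumptions for the sake of the sketch, not forecasts), compound growth makes the gap explicit:

```python
# Illustrative compound-growth sketch; all rates here are hypothetical assumptions.

def future_factor(annual_rate, years):
    """How much prices multiply at a constant annual growth rate."""
    return (1 + annual_rate) ** years

years = 2080 - 2025  # roughly "zoom out to 2080"

# A "managed" 3% inflation rate only multiplies prices about 5x by 2080.
factor_3pct = future_factor(0.03, years)

# But $4 gas hitting $500 would mean prices multiplying 125x,
# which implies roughly 9% inflation every single year for 55 years.
implied_rate = (500 / 4) ** (1 / years) - 1

print(f"{factor_3pct:.2f}x at 3%")           # ~5.08x
print(f"{implied_rate:.1%} implied yearly")  # ~9.2%
```

In other words, the nightmare numbers only show up under sustained triple-the-target inflation; the point is how sensitive the long view is to the assumed rate.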
Capitalism was a great motivator and pusher of progress to get us to the point of technological and societal progress that every human on the planet can be afforded a decent living without overt struggle.
We're at that point. And AI can absolutely help with that, but AI and Capitalism is a mixture that won't end well for the average person.
Way back before it was called AI, when I just called it automation, I saw the writing on the wall the moment they added self-checkouts to grocery stores. I was advocating for the idea of taxing corporations that use automation in a way that displaces human workers, and using that tax revenue to fund a type of universal basic income for those who get displaced, as well as training programs and education for different trades that are needed. A UBI that a displaced person could live on without going hungry or losing their home, but that would also motivate them to find work or training in other needed sectors in order to make more money than the UBI gave them.
The idea of the tax is that it also leveled the playing field a bit in a way that would offset the financial benefits of replacing humans and instead turn it into efficiency benefits instead. Advocating for a surgical use of automation instead of mass layoffs on a whim.
And eventually maybe people being displaced could have focused on pursuing education or personal enlightenment. But unfortunately under capitalism the only ones who get to do that are the CEOs, Shareholders, political elite, and billionaires.
Unfortunately none of that happened and here we are.
I’m in favor of UBI. But regarding new jobs, the question is whether there is more work that could be done if there were people willing to pay for it and the systems in place to make that work viable.
When people aren’t worrying about AI, sometimes they worry about demographics - more retirees to take care of and fewer workers to support them. Maybe that’s what productivity improvements get “spent” on? In some countries, anyway.
I’ve come to think these days that the problem is not the lack of workers, but with the ideas driving society itself.
In my grandmother’s final years she had a number of nurses. Many of them were Filipino, because they were “naturally caring”. Uncomfortable racialism aside, it raises the question of what a caring society looks like, and why elder care might be a problem that needs more than paid labor to solve. If we had a labor-independent society, maybe family would be free to stop working their jobs and take care of their elders. Maybe the volunteers who are currently helping elders would be free to take more direct action to help out. I don’t think we would have to worry as much about elder care in terms of paid labor.
The trouble is that the alternative to paid labor is unpaid labor. There are situations when volunteers can play a role, but it would be unethical to expect nurses to go unpaid.
Traditionally, people relied on unpaid labor by family members. And I'm living that because my mother came to stay with us for the winter and I expect to do more of that. It works for us.
Except it's not going to work for my wife and me, because we don't have kids. It would be both unworkable and unethical to attempt to rely on volunteers when we get old.
I do hope we will be able to find caring people to help us, but I also expect to pay them.
Those are each alternatives to different professions: stagecoaches, farmers (from efficiency gains), dentists, 🎵TV killed the radio stars, board games, punch card maintainers, programmers who write assembly.
Debatable. Are toothbrushes really an alternative to dentistry? I suppose in the medieval sense of dentists as "people who remove rotten teeth", but if anything I would think modern dental care actually brings more business to dentists through crowns, fillings, and cleanings?
I think the idea that innovations inevitably displace existing industry is a tad reductionist, is all.
It doesn’t seem particularly controversial that if you take care of your teeth, you can avoid a lot of expensive dental work.
You still need to see a dentist regularly and may need them regardless of how well you take care of your teeth, because shit happens.
That's a bad example IMO.
I used to get a lot of fillings due to poor oral hygiene, maybe one or two a year. This meant longer sessions at the dentist or sometimes a separate appointment to perform the extra work. Then I got gingivitis, which scared me into consistently brushing and flossing ever since. Haven't needed a new filling in almost 20 years. My dentist has even suggested that I don't actually need to go to them twice a year since they usually don't find much. Although I still go to the dentist twice a year, fillings and other dental work would be a significant addition to the typical maintenance work.
Toothbrushes obviously aren't an alternative to ALL dentistry, but I'm sure that the fact that toothbrushes are cheap and widely available means less work for dentists in aggregate. If we didn't have toothbrushes we'd need more dentists. That's just in a vacuum though. I think the number of dentists in reality is more influenced by economics and culture than any measure of overall dental health in the population.
Maybe, but toothbrushes are older than modern dentistry (although ancient dentistry and ancient tooth care tools like chew sticks are both thousands of years old). I don't think it's a good example of an invention that took jobs given that very long and complex global history. And even just looking at western tooth care, you're looking at over 400 years of toothbrushes.
Who’s to say that AI won’t create new jobs and opportunities? Excel turned accountants from people who manually did calculations in a book into people who wrangle Excel.
To be clear, I don’t think there’ll be a lot of ChatGPT wrangling jobs, but that’s because I don’t think it’s particularly useful outside of coding assurance, and won’t disrupt many jobs in the long term.
That's a good point. Right now the AI boosters aren't doing a good job of showing where these "gaps" in AI ability might actually exist, mostly because they're optimists who refuse to acknowledge the shortcomings of AI. IMO LLMs aren't likely to do much autonomous work, but I'm definitely not excited at the prospect of a gig where I babysit and fact-check a dozen or more LLMs who do my old job worse than me.
If LLMs are to disrupt industry in a way that reflects the amount of money being invested, you won't have to fact check them.
They're not very useful in their current limited, naive implementation states. They're useful in custom built agentic workflows where they're able to fact check themselves though.
My personal experience is that they have gotten better, but improvements have become incremental. If I had to guess, I think the technology will land somewhere between the confident bullshit generator we have now, and the AGI utopian future all of the leaders of these companies are promising.
That means that the new jobs best adapted to this technology are going to be for people who are skilled at building these agentic workflows: curating proprietary data and tying it to RAG pipelines, and having multiple models run verification steps and build tests on the fly to overcome their bullshit tendencies.
I imagine that a team inside a company that can find problems throughout the business, and solve those problems much more rapidly than a traditional internal dev team could using low code AI agent platforms would be really valuable for a lot of businesses.
The actual devs will continue doing actual dev stuff and won't have to worry about new project requests to build some crud app for some random HR team or whatever anymore.
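The generate-and-verify loop behind those agentic workflows can be sketched in miniature like this; `generate_draft` and `run_checks` are hypothetical stand-ins for model calls and fact-checking steps, not any real library's API:

```python
# Minimal sketch of a generate-then-verify agent loop.
# generate_draft / run_checks are hypothetical stand-ins for an LLM call
# and a verifier (second model, test suite, fact-checker), not a real API.

def generate_draft(task, feedback=None):
    # Stand-in for a model generation call; here it just tracks
    # how many revisions have happened so the demo terminates.
    attempt = 0 if feedback is None else feedback + 1
    return {"task": task, "attempt": attempt}

def run_checks(draft):
    # Stand-in for a verification step; "passes" after two revisions.
    return draft["attempt"] >= 2

def agent_loop(task, max_retries=5):
    """Regenerate until the verifier passes or retries run out."""
    feedback = None
    for _ in range(max_retries):
        draft = generate_draft(task, feedback)
        if run_checks(draft):
            return draft
        feedback = draft["attempt"]  # feed the failure back into the next attempt
    raise RuntimeError("verification kept failing")

result = agent_loop("summarize report")
print(result["attempt"])  # 2
```

The human role in this picture shifts from writing the output to designing the checks, which is the skill the comment above is describing.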
Yeah, I've been using Google Antigravity for a few projects, and I'm impressed. It's significantly better than trying to code by just having a conversation with a chatbot and copy pasting in and out. At its core, it's not that much different from a manual chatbot approach, but the ease of use and automation of gruntwork feels significant to me. It's not a magic bullet. I still have to iterate on the roadmap, then iterate on the design docs for each element on the roadmap, then iterate on the implementation plans for each element on the design docs, then iterate on corrections for shifting goals or things that went wrong. However, you can get to some good results on platforms/languages you don't know just by knowing fundamentals of programming and software engineering. I made a Chrome extension recently that seems to work quite well, despite the fact that I've never programmed in JS. I can't speak for professional software engineering, but for people like me who deal more in data analysis, one-offs, and hobby projects, it's been great.
The CEOs are pitching this as a way to replace labor.
Yeah, maybe in the middle term. I can't speak to what actually happens in 30-50 years (the timeline for progress that current CEOs want to claim is "only a few years away").
I guess that depends on whether we can find more uses for code or other AI outputs. It seems like excel didn't decrease accountants too much, just increased the amount of accounting we're doing. Maybe we don't necessarily decrease the number of programmers so much as increase the amount of programming/coding. I'm seeing some of this happen in research. People are applying coding approaches to a lot more situations than before since the barrier seems to be lower. Who knows how the balance of jobs will shake out a few years from now?
I'm reminded of the post from a bit ago with the perspective that AI models won't let us work less, just do more with less time. It seems to me that this is a pattern that has held up through history. As long as we feel like there are gains to be made, we'll think of jobs for people to do, until AI and robotics clearly outperform us in every aspect. Despite the hype from people like Altman, I don't think we're particularly close to that point so far, although people may need to be adaptable to stay employed, depending on how industries shift.
I firmly believe there are plenty more uses for code, so I do think that is the potential future. Many open source projects' development is limited by contributors, and often there are people who want to contribute but can't code and don't have a ton of money to fling around, so there's very little they can actually do. Not saying those people should contribute by vibe coding; rather, I'm saying that I frequently see scenarios where demand for coding exceeds supply. And there are tons of niche cases where the average non-coding person could probably come up with an idea for software that would be useful to some, but they can't make it themselves and it's not worth anyone else's time to do it.
Now I don't know that all those potential scenarios will come to fruition even with AI, because there are some limits to vibe coding, and there will probably still be some limits to what a developer will spend their time on if a non-dev can't vibe code it themselves. If it's something only one person wants, and it takes 10 minutes and they pay $30 for it or something, that might be one case; but if only one person wants it and it takes 10 hours, that's an entirely different circumstance.
40-60 years ago, we'd offer retraining programs or early retirement packages to make sure people stay on their feet. They did this because they learned from the 19th century what happens when you don't.
Part of what makes systemic violence systemic is that it's not always that direct.
In this case it's not just the fact that the useful product takes away jobs, it's also the fact that there is no systemic change to offset how that negatively impacts people.
In Sam Altman's case, if he also advocated for UBI programs or assistance for people who get displaced by his product, he'd be far less problematic.
Or if the society we lived in had better programs for addressing that kind of displacement, the product would be seen as less systemically violent.
But apparently that's "socialism" and we can't have any of that, can we? We prefer debt, hunger, and homelessness in this country!
And fyi this is coming from someone who advocates for AI as a technology but criticizes the societal structures that make AI dangerous. Every problem people have with AI as a technology can be traced back to systemic failures in our society. The problem isn't the technology, it's the system.
Sometimes he does:
OpenAI calls for robot taxes, a public wealth fund, and a 4-day workweek to tackle AI disruption.
I don’t think it’s going to satisfy anyone, though, because it’s just talk. Converting that into something real is up to governments, not the AI companies.
There is a very well-funded foundation, though, and they could do… something?
OpenAI accidentally built one of the world’s richest charities. Now what?
…
You know, that's actually really interesting. I'm going to read those and do some more research into that. Thank you.
I want to see if they end up putting their money where their mouth is.
Sometimes there's a middle ground, like virtual adding machine software from 1993 that mimics the electronic adding machines to which accountants were accustomed—machines which were themselves heavily inspired by earlier mechanical adding machines.
I am reminded... (The golem speaks with capital letters)
Killing less than 2.338 people gets the death penalty in some states. And yet we're so much more accommodating of those who kill indirectly if they're making other people a lot of money in the process.
Hell, now you can bet on those deaths on Polymarket at the same time. It's fine!
Of course not. (I recognize that you're positing a devil's advocate argument.) By this logic, if you and I are competing for a job and I get it, that would also be violence. It would be worse, actually, since Altman is much farther removed from you and I'm competing with you directly.
Me getting a job over someone else doesn't mean that other person can't get another job.
Altman's actions can definitely be argued to directly affect you, regardless of whether you're an AI engineer, work in a different sector entirely, or are simply located next to a data center.
No Other Choice (2025)
Sadly, what's actually happening is that people are instead being convinced it's the immigrants, wind turbines, trans kids playing sports, bike lanes, etc. causing their problems and not unchecked neoliberalism, which is then used to push for even more unregulated capitalism.
God I hope so. Automation is supposed to make everyone's lives better, not just the freaks at the top.
Oh, I'm definitely not advocating for violence either. But I can see why there has been so much anger.
A socialist revival in politics would be the good outcome...
Sadly, yes. I wasn't making Bane posts on other parts of the web about the warehouse fires because I approve. It's because if you read the history of the Gilded Age some 100-120 years ago, we're going through the same steps all over again.
A cornered rat strikes back, and we don't have a Skynet robot army to stop that as of now.
What frustrates me is that replacing people with robots at said jobs should be a good thing. We want to strive for a world where we don't have to work on things we don't need, so we can work on things that matter to us. That's the life-long dream.
In a functioning society, robots doing all the work would mean people being more free, not less.
But in a capitalist society, robots replacing you just pushes you into worse and worse jobs to further help the capitalist class hoard resources, leaving you with even less in the end.
So as it stands, people will just attack the robots and their owners because they know there are no social safety nets otherwise. And I don't blame them, their government has failed them.
I think AI is overall a bad thing. The sheer amount of LLM-generated slop littering Reddit, Instagram, Facebook, YouTube and TikTok these days shows just how spot-on the Dead Internet Theory is.
One example of how prevalent AI has become is when you look at pub advertising on social media. Almost every pub I follow uses AI to generate images in their posts, and while I don't blame them for doing this rather than hire actual graphic artists and professional photographers (labour is expensive and pubs are struggling), it leads to glaring inaccuracies like seeing a post advertising televised darts matches with completely incorrect numbers, or even random letters along the rim of the dartboard. Or... AI-depictions of pro players wearing jerseys that have complete gibberish sponsor logos.
And don't get me started on the blatant Ghiblified cartoons and piss-filters I see on a lot of AI images.
Definitely my experience, although being pushed into a worse job is a hopeful outcome. I was laid off from a commercial reporting job last year and struggled for months to even find anything in accountancy and finance. AI is a very big reason why the job market is in such shambles. If entry level programming jobs are practically nonexistent these days thanks to Anthropic and Claude Code, imagine how it is for any office-based job that doesn't require you to write code...
The most code I've ever written in a professional setting has been the occasional VBA script to automate a tedious task. Otherwise, much of my work involves basic data entry in an accountancy system and using formulas and pivot tables in Excel. Claude could probably do about 90% of my previous job.
I have genuinely questioned whether I should continue studying ACCA or pivot towards a different field entirely. But this would be the second time I've gone into higher/further education and ended up with my academic qualifications becoming worthless. And I don't know what I could even do that is AI-proof. I didn't spend years and lots of money studying degrees and professional qualifications to work in a warehouse or cleaning offices for minimum wage...
We unfortunately do not live in such a society. We need money in order to acquire food, drink, shelter and everything else we need to survive. Cut off the source of money and people end up starving.
UBI is often touted as a solution but I think it's a socialist pipe dream at best. Any nation that implements it will either go bankrupt or pay a pittance that doesn't help anyone. The big question is how will anyone even fund it? We can't even make billionaires and big businesses pay tax at the moment - what the hell makes people think they'll suddenly cough up when unemployment soars?
I think the next few years are going to get incredibly ugly. Either the AI bubble bursts and crashes global stock markets, or entire industries of people get replaced and we end up with mass civil unrest, which could end up being suppressed brutally, especially if AI ends up being implemented in autonomous weaponry.
AI is being used as another tool to squeeze the working class for everything they can get out of us. It itself isn't the problem (at least intrinsically). It's just another cog in the orphan crushing machine, albeit a particularly large and annoying one.
Technology in general has always been harnessed by the powerful to subjugate the masses. It's getting terrifyingly efficient at it now though. They can know where you live, when you sleep, what you do, who you talk to, what you eat, your medical records, your purchasing habits, listen in on your conversations, and use any of that information however they want to.
Regarding your Dead Internet Theory thought, it has been very sad to see so many communities "polluted" by LLM output text. It used to be that writing coherent longform text was enough of a barrier that most spam was obvious. But now, for example, an Amazon review farm can use LLMs to create dozens of pretty credible, detailed 4 and 5 star reviews. You'll only know when the product turns out to be dogshit.
That's to say nothing of the LLM spam on forums these days, I get pretty annoyed when I realize I'm reading a few bullet points bloated out into multiple paragraphs. Blog posts, too.
It's very sad to see this tragedy of the commons playing out in real time. It honestly makes me want to totally drop the internet some days; I'm here to meet with minds, not lap up slop.
To starve and to be homeless and all that as we cut public assistance. Even job retraining resources get cut drastically. So losing the job and the field makes it feel hopeless, not like there's a great big beautiful hopeful figure out there.
Sam's reaction blog post to this is pretty interesting: https://blog.samaltman.com/2279512
A paragraph explaining how his family is the most important thing in his life.
Then 10x that amount of text for... yet another AI sales pitch? And, in classic tech bro fashion, a Lord of the Rings reference that, much like Peter Thiel's Palantir, feels like either a malicious interpretation of the source text or the ravings of someone who didn't actually read the books:
First: is the AGI in the room with us right now, Sam? How did we go from "attacks on my home make me sad" to "AGI AGI AGI" this fast?
Second: if controlling AGI is the One Ring (if Sam had read the books, he might have learned that there are quite a few Rings of Power, but I digress), is Sam Sauron in his own analogy? And we're supposed to trust Sauron's selfless gift to all of the free peoples of the earth? What could possibly go wrong!
Maybe the scariest thing about this analogy is that it implies that Sam thinks he's as personable as Sauron was before he revealed his true nature. Quite the ego.
On one hand, obviously nobody should be throwing molotovs at anyone else's house. That is a bad thing. On the other hand, a guy who can't separate the safety of his family from a sales pitch and one of the worst LotR analogies I've ever read... is not exactly easy to empathize with.
Even ignoring the sales pitch in there, it's impossible to take him seriously because he offers nothing but platitudes, nothing of substance or specificity. He won't actually stand up for anything. He acts like he understands the frustration people have with him, with the company he works for, and with the industry he works in, but he can't actually name the root causes of the problems or offer solutions.
He mentions how democracy needs to stay a democracy and companies shouldn't capture the power, without addressing how it's widely believed to already be corrupted, that companies have all the power and democracy is non-existent. What is Sam Altman doing to help? Aside from the pitch for OpenAI, the post was also supposed to be a pitch for Sam Altman, the guy who is on 'our' side, who understands the plight of the people who might molotov cocktail his house. But the most he can offer is a line or two about having a family. He only needs to look around at all the people getting crushed by the system to see that having a family doesn't protect anyone, so he might want to find a better pitch than that.
The pump is nearing its end and the dump is coming. The business is still lighting money on fire, and there have been noises of an IPO. Going public is the classic way to find a bunch of bag-holders and jump into a life boat before the ship goes down. (And every media opportunity is a chance to keep the hype alive.)
Meanwhile, datacenters are being put on hold or cancelled from local pushback, energy costs due to the invasion of Iran, hardware shortages, transformer shortages, etc..
You've got growth stalling, ludicrous amounts of debt and no realistic way to ever pay it back. Except a trillion dollar valuation and maybe enough suckers to buy shares. From there, it's either cut-and-run time or we'll end up with another idiot meme stock.
I'm also not surprised the owner of a machine people use to avoid reading, writing and thinking doesn't grasp the basics of a reasonably challenging piece of literature.
Jesus christ that blog post reads like something straight out of r/linkedinlunatics
"✅ what getting a molotov cocktail yeeted towards my house has taught me about b2b sales"
Really, I have to wonder whether he was always like that - or whether constantly talking with an overly agreeable language model makes someone self-centered in that way.
Not joking about that either; AI-induced delusions are well known by now, but that's just an extreme that might be easier to identify.
AI psychosis, right? Or is that more specific?
What they're describing is more of a Yes Man feeding your narcissism all the time. Psychosis would be a much higher level of disconnect from reality.
Personally, I suspect AI helped him write that post in the first place, so it doesn't surprise me that it sounds incredibly corporate and "LinkedIn," cuz that's what it would be matching.
Well I didn't suspect him of setting it up for publicity until that.
He's a company man!
This, alongside the other fires that have been set at warehouses make me wonder how far into the start of revolution we're in.
They're calling the guy who set the "original" fire "Paper Mario" and putting him in memes alongside Luigi.
Also saw someone say "so when I ask my boss for a raise should I reference that he made $2 more an hour than me?"
Idk that it is a sign of revolution or a sign of deteriorating social norms and increased violence, or both. The general increase (real or perceived) in violence and aggression in customer service, Jan 6, a bunch of things point to us being less and less civil.
The link to Jan 6th is interesting. Intentionally or otherwise, Trump has somewhat normalized (and even legitimized and pardoned!) physical violence against the ruling class. One wonders if that could ultimately be his downfall.
I firmly believe the increase in violence in schools (fights and such) is tied to the normalization of both violent language and threats from the Trump administration over the past 10 years
Paper Mario is a great one haha.
I have no idea either, but seeing more and more of this desperation makes me wonder how much of it will keep happening until something changes.
I don’t get it? Like I get Mario and Luigi are both video game characters from the same game but beyond that… do I need to play the game to get it?
Mario and Luigi are both characters from the same franchise, there are some games in that franchise titled "Paper Mario", and the warehouse that was set on fire was a paper goods facility.
Also people have been "joking" about wanting a "Player Two" or a "Mario" alongside Luigi online for a while. Some less wanting and more expecting it to be inevitable. So when someone else emerged as a symbol of class consciousness while allegedly committing a crime, the door was already open. The paper part is just convenient.
The other piece of the puzzle that folks haven't mentioned is that people are connecting this guy to Luigi Mangione, the man accused of assassinating a health insurance CEO
When you say "the same game" which specific one are you thinking of that you should play?
(There are so many I just found this phrasing odd!)
Paper Mario I guess, which I haven't played. The last Mario game I played was probably Super Mario 3. I didn't know it was a thing.
Gotcha, there are multiple Paper Mario games which play with dancing between 2 and 3 dimensions and the idea of folding things up into new shapes. One of them was the spiritual successor to Super Mario RPG IMO, but they've all been fun.
There's no other connection though, it's just a reference
I'm no expert, but I feel like there needs to be some amount of organization in order to be a revolution. Either I'm not cool enough to know who those people are (very possible), or they don't exist (also very possible).
Well, it's not a revolution yet, if it ever will be one. But I suspect there are folks who would comfortably pivot their organizing to that sort of work.
The Onion: Man Who Threw Molotov Cocktail At Sam Altman’s Home Claims He Was Following ChatGPT Recipe For Risotto
Someone shot at his home last night as well.
It's getting rough out there, y'all.
There's a big gaming-adjacent forum I used to frequent, ResetEra, which always billed itself as a cut above others in terms of being civil and progressive. It was actually formed when the mods and most users of the website NeoGAF jumped ship in protest after the controversial owner got #MeToo'd.
They used to handle the AI discourse about as reasonably as Tildes does, but a couple years ago some extremely anti-tech faction of the mod team successfully lobbied for policies purporting to sharply limit AI discussion, which they promptly started abusing to smother informative or newsworthy threads while letting low-effort ragebait on the topic proliferate. As somebody with very mixed feelings on the technology who nevertheless enjoys talking about its evolution and implications, this was a very frustrating development.
Recently there were multiple threads about violence directed at people associated with the industry, including the now-multiple attacks on Altman's home and gunfire shot at the house of an Indiana politician who backed data center construction. Both were filled with comments glorifying the violence and openly hoping for more. Despite being flagrant violations of site rules, the posts stayed open for days and reports about them were ignored. When I finally called out the moderation failure in the site's official meta thread and asked why they were tolerating people advocating murder, they permabanned me instead.
I don't consider it a great loss, since I'd been spending minimal time there lately due to the mods flouting their own rules like that more and more often. Revisiting any thread more than a year or two old invariably shows a graveyard of normal-seeming commenters whose accounts have since been banned (or self-deleted in protest) for crossing the mods one way or another. But it says something that one of the last big progressive forums with a solidly 30+ progressive audience is being infected by this kind of bloodthirsty rhetoric, and that the people in charge are less inclined to punish people braying for murder than they are the people accusing them of tolerating calls for violence. You kind of expect to see that on social media these days, but part of me hoped that older-school places with alleged standards would hold the line better.
I wonder if the internet is making it harder to organize against those in power because it provides a weak local maximum to our sense of belonging. If these people were talking about AI with their local unions then they’d actually have people to march with.
Time to watch Elysium (2013) again.