What are your thoughts on species scale ethics vs individual scale?
For example: 500 people working long hours in dangerous conditions for terrible pay, but they make it possible for 5,000 others to live in a utopian society. What about 50 workers and 50,000 beneficiaries? I think everyone can agree that it's wrong for there to be fewer beneficiaries than workers, but what about 50/50? What if it's 500 blue-skinned people and a million red-skinned?
I usually find myself internally preferring the species level ethical decisions, but I've never been brave enough to admit to it out loud because I know it makes me sound like a socio/psychopath.
It makes for neat solutions to many problems. And it stays that way, for me, until I think, "What if I or someone I care about were one of the 50?" At that point empathy kicks in and I realize that it's cruel to enslave people. And so my ethics change, and the arrangement no longer seems ethical.
It's impossible to separate the individual from ethical problems. If you believe you can, it's not because you actually can; it's that you're ignoring that part. Many societies--historical and contemporary--ignore the individual.
But surely at some point the good has to outweigh the bad? Imagine a world far in the future where every need is met by robots, but one person needs to control them all and his life is miserable for it. Surely his deliverance isn't worth upending all of society?
In your scenario, that is better than our world today. One person suffering for the benefit of billions is far better than billions suffering for the benefit of billions, no doubt about that. Correct me if I'm wrong, but it sounds like you are asking at what point it becomes ethical. In my opinion: never.
My happiness is not more valuable than anyone else's, and therefore shouldn't come at the expense of someone else's happiness. However, I recognize that the comforts of my lifestyle likely do come at the expense of others in ways I may not even be aware of.
So it seems you and I have different definitions of what "ethical" means, which basically boils down to the title of my post: individual or species. What I'm calling individual ethics is that everybody should be as happy as possible, even if that means everyone is only content and not truly happy, though maybe in the future we can find ways to improve that. What I'm calling species ethics is to improve net happiness, so if everybody's happiness is "worth" the same amount, then it's worth sacrificing a few for the many.
I think that everybody wants to believe that through technology and policy and whatever else, we can make individual ethics as good as or better than species ethics, but if we want to be realistic we have to accept that it isn't. And if we do accept that (which many may not, and I may just be heartless), then what ratio are we comfortable with?
Just a passing thought: why does anyone need to sacrifice their own happiness? If we have something like your example, where automation or other such means reduce the need for work, why not still share that load as a species?
It's all hypothetical, but the core of the question is basically: is it better to share the load and all be mostly happy, or to have many be very happy and a few be unhappy? I think in reality it's likely to be a mixture; first-world countries benefit from third-world countries but will try to help out if it doesn't cost too much, for example. But I think stripping away the real world and dissecting the individual parts is valuable.
I would actually say that we have at least a similar understanding of ethics. I tried to address both of those clearly in my comment, but maybe I wasn't as clear as I had thought.
Basically, yes, we should work towards better net happiness. One person suffering for every nine people living contentedly is better than five suffering for every five living contentedly. But I think that perspective changes when you get down to an individual level. I will try to clarify my response to your one-man-controlling-all-the-robots scenario. Why should the burden of society fall on the few, when the many are just as capable of doing the work? Why should one man have to suffer and work his life away while the many benefit from his labor, giving nothing in return? Never offering to take the controls for a day, or creating a system where everyone puts in their fair share of the labor. In this utopian society, instead of one man taking the reins, why can that role not cycle between a few hundred people who each work one day at a time, once a year?
I think @demifiend 's quote does a great job of putting into perspective what I am trying to get at with individual ethics. I know that I would not be content benefiting from the suffering of that man in the control room, because every time a robot grabbed me a glass of water, it would only be possible because of the man slaving away in there. Why couldn't I have just gotten up and poured the glass myself? Therefore, I wouldn't be content with myself and my life, knowing my experience came at the expense of that man in the control room.
On a more practical note, having a few hundred people each taking a turn in the operator's room substantially reduces the bus factor. Put the whole operation on one guy's shoulders, and you're in a world of hurt if he gets run over by a bus on his way back from lunch.
Or decides he doesn't want to run the show anymore and sabotages/walks out on everything.
Can you blame him?
Nope. Burn it to the ground, my friend.
That's a good point, because it's very true that the guilt of knowing you're benefiting from the man's suffering would reduce your happiness, which is counter to why he's there in the first place. As for the rotation idea, that's still in the same category as the one man; it's just more gray.
In all of the examples I've given, there's been a somewhat dramatic premise that the ones suffering are suffering a lot. I do that because I'm interested in breaking the concept down into its parts, and extremes act as a kind of controlled environment for thought experiments. In reality, though, yes, it's better to have people share the load. If we stick with the robotic-overlord scenario, then the individual answer would be that everybody takes a turn at some point, and the species answer would be that these 10 or 100 or 1,000 people cycle through the role.
So for your conscience: would you rather everybody spent a day, or a set group cycled through but could still be happy when it's not their day? If the latter, would you be comfortable not being one of the people who cycle through? How many people would need to be in the program before you felt OK not being one of them?
Well, I also think that the suffering would stem from more than just physical labor. If you spent all day every day making sure everyone's last desire was fulfilled, while spending your days working in solitude, that would get to you. You would be the sole outcast, or one of the few. You might view yourself as not worthy of being a member of privileged society. I think the mental suffering from that would be inevitable and quite intense. Whereas, if you had others offering their time to help relieve you from the work, you would view it as a fair society that is inclusive of everyone.
Interesting question. I would probably say a voluntary system would be best. People that want to spend a few days of their time contributing and giving back. Sure, there would be a large portion that would never work a day in their lives, but in these scenarios, that's unavoidable. Point is, to ensure individual scale happiness for everyone, I think forced labor would have to be eliminated altogether.
"Your voluntary labor application has been rejected as a result of all voluntary roles already being taken."
That made me laugh pretty hard. I can't think of anything more to add to this discussion though, so thanks for your time. I enjoyed this.
Agreed, I had never given the topic any thought. This was a good thought experiment. Glad you liked my joke. :)
What you're grasping for is called utilitarianism, defined as "the greatest happiness of the greatest number" by its originator in Western thinking, Jeremy Bentham. In this philosophy, utility is the net benefit of an action, after considering the happiness and the misery it will cause. So, we weigh the unhappiness caused to those 500 workers against the happiness caused to the 5,000 non-workers, to determine whether the net outcome is an increase in human happiness or not.
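The weighing Bentham describes is just arithmetic once you assign numbers, which is, of course, the genuinely hard part. A minimal sketch, with utility weights invented purely for illustration:

```python
def net_utility(groups):
    """Sum (population * per-person utility) across groups.

    Each group is a (population, utility_per_person) pair, where utility
    is positive for happiness and negative for suffering. The result is
    the Bentham-style net outcome of the arrangement.
    """
    return sum(pop * util for pop, util in groups)

# The thread's 500/5000 scenario: say each worker suffers -10 "units"
# and each beneficiary gains +2 (made-up weights):
print(net_utility([(500, -10), (5000, 2)]))  # → 5000, a net gain

# The 50/50 split with the same weights comes out negative:
print(net_utility([(50, -10), (50, 2)]))     # → -400, a net loss
```

The catch the thread keeps circling is that everything hinges on those per-person numbers, which no one knows how to measure.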
Interesting, I knew there had to be an official school of thought about this but never bothered to go find it.
Ursula K. Le Guin's "The Ones Who Walk Away from Omelas" is relevant.
Extra credits explores this idea in their analysis of Prey: https://youtu.be/Z3QsCy4ekWk
The whole game of Prey is centered on this concept.
The thing about hypotheticals like these is that they don't tend to make much sense as either-or decisions outside the realm of thought experiments. For instance, why can't the 5,000 beneficiaries merge labor pools with the 500 workers and split up the work, so that everyone only has to work about 9% as hard as the workers do now? Or if that's not possible for whatever reason, just rotate which portion of the total population is engaged in that kind of work, so that nobody's doing it for more than a few months at a time.
Now, it's possible that for social reasons they don't do that, but in light of the existence of these options, you'd be pretty safe in calling any society that did use that sort of caste-based labor system deeply unethical, regardless of how few people were employed in hard labor.