17 votes

The Trolley Problem

An interesting thought experiment that I vividly remember from undergrad philosophy courses is the trolley problem:

You see a runaway trolley moving toward five tied-up (or otherwise incapacitated) people lying on the main track. You are standing next to a lever that controls a switch. If you pull the lever, the trolley will be redirected onto a side track, and the five people on the main track will be saved. However, there is a single person lying on the side track. You have two options:

  1. Do nothing and allow the trolley to kill the five people on the main track.
  2. Pull the lever, diverting the trolley onto the side track where it will kill one person.

A variation of the problem that we were also presented with was:

You see a runaway trolley moving toward five tied-up (or otherwise incapacitated) people lying on the main track. You are standing on a bridge that runs across the trolley tracks. There is a large man on the bridge next to you who, if pushed over the bridge and onto the track, would safely stop the trolley, saving the five people but killing the large man. Do you:

  1. Push the man over the bridge, saving the five people.
  2. Allow the trolley to kill the five people.

Which is the more ethical option? Or, more simply: what is the right thing to do?

52 comments

  1. [20]
    moocow1452
    Link

    The Good Place probably has the best contemporary exploration* of this issue.

    *And the best solution.

    20 votes
    1. jgb
      Link Parent

      I kinda wish that the suited man's 'solution' was multi-track drifting.

      5 votes
    2. [2]
      Adys
      Link Parent

      VSauce made a Mind Field episode about it called The Greater Good. I really liked it.

      Also I highly recommend watching Mind Field, it's great.

      5 votes
      1. 0d_billie
        Link Parent

        Another vote for Mind Field - it's the only YouTube Premium content I've been compelled to watch (the rest looks like garbage). It's a shame Michael isn't putting out his regular content any more, but on the other hand the production values of Mind Field lead to such an amazing end product that I will happily wait for a new season :)

        3 votes
    3. [2]
      skybrian
      Link Parent

      I watched the first three episodes of the Good Place and found it simplistic. It seems like if the main character were a halfway decent lawyer, they could get what they want very easily just by asking nicely? Does it get better?

      3 votes
      1. moocow1452
        Link Parent

        Yeah, it's a little clunky in Season 1, but it picks up halfway through and really takes off in Season 2.

        7 votes
    4. [14]
      JakeTheDog
      Link Parent

      I love it!

      Though I would argue that this is a very real problem, and that it is not a good solution, now that we have automated machines, e.g. self-driving cars.

      2 votes
      1. [13]
        skybrian
        Link Parent

        I'd be somewhat surprised if it came up for real with driverless cars. I think they will just slow down at the first sign of trouble and avoid being surprised in a situation where they can't stop safely.

        4 votes
        1. [12]
          JakeTheDog
          Link Parent

          It's actually a hot topic among self-driving car engineers, and has been for a while. It's not just about the present moment, but the very near future.

          4 votes
          1. [4]
            skybrian
            Link Parent

            Well, the subhead says "But developers say the moral quandary just isn’t very helpful" and the article goes into detail why. But interpret that how you will.

            5 votes
            1. [3]
              JakeTheDog
              Link Parent

              Oh of course not, it's an ancient moral quandary for a reason. The point is that the time has come where we need explicit answers/solutions. It's no longer avoidable / a mere thought experiment.

              1. [2]
                vektor
                Link Parent

                Quite frankly, no. We don't need an explicit solution, because humans have none either. If a human were in such a situation, we would just fumble around and do whatever, with no reason whatsoever put into the decision. A machine could flip a coin and be on equal footing with humans.

                But no, we demand excellence, even though self-driving cars are already safer than humans. Perfect is the enemy of good enough.

                3 votes
                1. JakeTheDog
                  Link Parent

                  Except 1) computers do not fumble around like humans, they have explicit instructions to make specific decisions on timescales millions of times faster than humans, and 2) your statement will not hold up in a court of law when an automaker is asked why their car made one decision over another.

                  Quite frankly, states are already taking legislative steps toward how the trolley problem should be approached, such as making bias based on age, gender, etc. impermissible.

                  1 vote
          2. [7]
            NaraVara
            Link Parent

            Why? The "trolley problem" itself isn't meant to be interesting on its own, it's just a discursive vehicle to help you understand how different ethical frameworks advise actions. It doesn't actually tell you anything about the ethical implications of any decision you're making, it just helps you clarify the real-world application of whatever ethical framework you're formulating.

            1. [6]
              JakeTheDog
              Link Parent

              it just helps you clarify the real-world application of whatever ethical framework you're formulating.

              Exactly.

              1. [5]
                NaraVara
                Link Parent

                But if you're designing a car-driving brain then you already have the real-world application in front of you. What additional benefit does the trolley problem give you that the task you're undertaking doesn't give you a perfect example of already?

                Moreover, AI designers shouldn't be formulating fresh ethical frameworks to inform their designs and the idea that they would be is quite troubling. The ethics already exist and the relevance to design decisions should be left to the actual ethicists. I don't want a bunch of engineers fumbling around with things this important.

                1 vote
                1. [4]
                  JakeTheDog
                  Link Parent

                  But if you're designing a car-driving brain then you already have the real-world application in front of you. What additional benefit does the trolley problem give you that the task you're undertaking doesn't give you a perfect example of already?

                  Now I'm confused as to what you're getting at. The trolley problem is just a generalization of real world problems. It's a tool.

                  The ethics already exist and the relevance to design decisions should be left to the actual ethicists. I don't want a bunch of engineers fumbling around with things this important.

                  That's the problem: they have only recently begun to exist, in the form of legislation. Right now, the ethical handbook is quite empty, namely because there are no obvious solutions and the solutions are culture-dependent.

                  1. [3]
                    NaraVara
                    Link Parent

                    Now I'm confused as to what you're getting at. The trolley problem is just a generalization of real world problems. It's a tool.

                    It’s an analogy to help think through the logic of ethical frameworks. It adds zero value to designing or developing machine logic around driving.

                    Right now, the ethical handbook is quite empty, namely because there are no obvious solutions and the solutions are culture-dependent.

                    The ethical handbook is quite full, most people just don’t read it.

                    No, there aren’t obvious solutions, and the fact is there never will be. People need to get comfortable with that uncertainty and learn how the decision-making process and limits get set.

                    1. [2]
                      JakeTheDog
                      Link Parent

                      It’s an analogy to help think through the logic of ethical frameworks. It adds zero value to designing or developing machine logic around driving

                      Except it's not just an abstract analogy. There are and will be real situations where a self-driving car must decide who to kill/rescue. The trolley problem exists in real life. It's not a mere thought experiment.

                      The ethical handbook is quite full, most people just don’t read it.

                      How is it full? In this specific case? Where are the ethics surrounding this problem? Mind you, again, ethics are not universal. Your book is not the same as others.

                      and the fact is there never will be

                      I disagree. I'm not a moral relativist. I think there will be solutions that make us look like barbarians in hindsight (just as we look upon the past).

                      1 vote
                      1. NaraVara
                        Link Parent

                        The trolley problem exists in real life. It's not a mere thought experiment.

                        The trolley problem is a thought experiment because it posits an extremely specific and controlled context and assumes infinite time to mull over alternatives. The problem they’re dealing with in real life is more detailed and complex, so the trolley problem itself is useless to them.

                        How is it full? In this specific case? Where are the ethics surrounding this problem? Mind you, again, ethics are not universal. Your book is not the same as others.

                        Ethics is an entire formal field of study within philosophy. There is an entire library worth of things to read. Yes, it is quite full.

                        I disagree. I'm not a moral relativist. I think there will be solutions that make us look like barbarians in hindsight (just as we look upon the past).

                        You just now said ethics aren’t universal and are culturally dependent. How do you reconcile these statements?

                        And of course there aren’t precise answers. Do you think there are mathematical proofs for any of this?

                        1 vote
  2. [4]
    Thunder-ten-tronckh
    Link

    Fucked up scenarios beget fucked up decisions. In the two that you described, the decision-maker is always complicit in someone's death by nature of observing the possible outcomes. The decision not to decide is still a decision and all that good stuff.

    In scenario 1, I pull the lever on the rationale that I had no hand in placing anybody on the track, so my choice to save 5 lives does not come at the expense of endangering someone external to the situation. Basically, 1) it's not my fault this scenario exists, 2) I am complicit in the deaths of others regardless of my choice, 3) the only thing my choice directly affects is the direction of the trolley, therefore 4) I alter the path so that pain and suffering are minimized.

    In scenario 2, I say "shit happens" and do nothing on the basis that I won't harm someone external to the situation. Similar thought process: 1) it's not my fault that this scenario exists, 2) I am complicit in the deaths of others regardless of my choice, but 3) the opportunity to save 5 lives requires action that kills someone external to the problem, therefore 4) I choose to do nothing on the basis that I won't create new suffering to undo the suffering of others.

    8 votes
    1. [3]
      mike10010100
      Link Parent

      Right, but generalizing (2) as "I'm complicit in the deaths of others regardless" kind of removes the fact that in one case you're complicit in the death of 5, and in the other you're complicit in the death of 1.

      3 votes
      1. [2]
        Thunder-ten-tronckh
        Link Parent

        It wasn't my intention to minimize that. That's exactly why I chose to pull the lever in scenario 1, but when weighed against the idea of being complicit in the death of someone who isn't already tied to the tracks, I couldn't justify pushing the man in scenario 2.

        4 votes
        1. ChuckS
          Link Parent

          The way I keep thinking about this is that there are expected behaviors for vehicles on the road (this is all about autonomous vehicles).

          Say a family is together and mom pushes the baby stroller into the street without looking, while dad sees the car coming and doesn't step into the street. Same scenario now - does the car keep driving in the street or does it veer onto the sidewalk? Kill the mother and child, or kill the father?

          I would argue here that everyone is going to expect that the car is going to continue driving on the street. If the father or mother sees the car coming, their reaction would be to get back onto the sidewalk. If the decision is made to put the car on the sidewalk and the mother leaps back, pulling the stroller back, then what would have been a near miss becomes a dead family of 3.

          The terrible reality is that some accidents are unavoidable. Some kid is going to run out from between two parked cars and it'll be too late to stop. Going to happen. Every parent would try to pull the child back. The best reaction is to ensure the event is being recorded, apply the brakes, and hope they get out of the way in time.

          The thing I'm most excited about, actually, is that self-driving cars should comply with the posted speed limits, which should mean traffic at 25 mph in residential areas. People usually go 30+ where I live now, and my childhood home was on a street that shortcutted the town stoplights, where people would routinely do 40+.

          3 votes
  3. [2]
    Deimos
    Link

    A team from MIT made a site largely based around the trolley problem a few years ago, named "Moral Machine": http://moralmachine.mit.edu/

    It can be interesting to try out, and they've done some analyses of the data from it, showing things like differences in choices between people from different cultural backgrounds. This looks like a pretty good article about it: Inside the Moral Machine

    8 votes
    1. krg
      (edited )
      Link Parent

      Cool application. I answered a few without reading the description and didn't realize the walk signal was part of it. Here are my results, anyhow.

      Even though the people you run into are guaranteed to die, I still tried to place myself in the driver's seat and somewhat rationalize my choice. For example, my results show a preference for saving women. Well, that's because I believe men are generally heartier and would have a higher chance of surviving a collision with a car than a woman (and that seems to be the case). Again, it doesn't matter in this application, as death is guaranteed... but it's something that could possibly flash through my mind if I had to make a choice.

      1 vote
  4. [2]
    rkcr
    Link

    I don't have much to say about the actual trolley problem itself, but rather I get a huge kick out of all the trolley problem jokes. Also related: how a kid solves the trolley problem.

    There are a few albums you can find chock full of trolley problem jokes, but I won't link them because there are a few extremely problematic "jokes" contained therein. 90% of them are hilarious to me though.

    5 votes
    1. mrbig
      Link Parent

      Could you send me the links to the problematic ones? I promise I’ll use them responsibly.

  5. [2]
    Douglas
    Link

    Logically, pushing the person over the bridge and flipping the switch have the same outcome, so if I were to choose one I should also do the other-- but then there's that pesky part of the brain that has a harder time inflicting pain on someone the closer they are to you.

    I've often thought that flipping the switch or pushing a person over the bridge makes you responsible for killing that person, and equally makes you responsible for saving the others, and is the right decision to make.

    And then I think about how simplified the problem is by using a literal switch, and how switches of all sorts are flipped every day by people in power: gutting healthcare, welfare, or lifelines that directly impact the well-being of individuals, to the extent that removing them would kill people. Not acting enough on climate change and issues that affect everyone on this planet feels like a very big switch not to flip. Are those politicians not murderers themselves to some degree?

    So I'm not about to beat myself up for killing one person in exchange for saving the lives of others, because it's better to've done it to save people with the knowledge I had in that moment than to've not done anything at all. And because far worse choices have been made by far worse people for far worse reasons than whatever I did in that moment.

    ...that's the neat little bow I'm sure my brain would quickly tie over the memory and guilt that would undoubtedly plague me the rest of my life as I think of the frightened look in the person's eyes when they asked me not to kill them, please, they have a child, they'd say, but I'm sorry, I've worked this all out long before it happened, and the sacrifice must be made.

    4 votes
    1. Archimedes
      Link Parent

      The part I can't get over is that, in reality, pushing someone over the bridge isn't very easy to accomplish or likely to actually do anything to help, so the thought experiment just breaks down as it becomes a complex series of probabilities rather than a binary choice.

      1 vote
  6. [2]
    balooga
    Link

    Obviously there are merits and demerits to both choices, but after wrestling with this I think I land on the side of "don't pull the lever."

    For me it boils down to the culpability of action vs. inaction.

    If I take action, I'm directly responsible for the consequences of that action—in this case, the death of one innocent person who otherwise would have lived. Their blood is on my hands, and my conscience. I actively killed them. I could, justifiably, be convicted for murder.

    On the other hand, if I do nothing, I'm not responsible for events that were already in motion before I arrived on the scene, which would've played out exactly the same if I weren't present. It's true that more people have died, but the blame for that is on the shoulders of whoever tied people to the track and sabotaged the trolley brakes in the first place. More importantly, my actions have not directly caused harm to anyone, and I bear no responsibility in the matter.

    It's not perfect, but I think inaction is preferable here. I don't go to jail, and I can sleep a bit better at night because I'm able to externalize the event as something tragic I merely witnessed rather than participated in. Hopefully the real perpetrator bears the full weight of guilt and consequence, as they should.

    (Worth pointing out... If the situation were different, so I could save everyone without actively harming another, I'd totally feel a moral obligation to intervene. But that wouldn't be the trolley problem, would it?)

    3 votes
    1. [2]
      Comment deleted by author
      Link Parent
      1. balooga
        Link Parent

        In general, I don't support legislating morality. Having a moral duty to act is quite different from a legal compulsion to act. Laws should stop people from harming others, but they shouldn't be used to force them to do something they wouldn't otherwise do. In my view that's an overreach and a violation of personal liberty.

        The Wikipedia article you linked, under the Common Law heading, mentions a few general situations where a reasonable duty to rescue could arise, and I agree with those.

        1 vote
  7. krg
    Link

    So, this is essentially a moral Rorschach test?

    I pretty much agree with the points @balooga made.

    Hmm...how would one up the stakes, though...

    What if there were no other person on the other track, but pulling the lever would kill a random person elsewhere in the world? What if the trolley was in an unpopulated place (besides the passengers and you by the lever) but was carrying a weapon of mass destruction towards a populated city center, and pulling the lever would cause it to detonate, killing you and the passengers but saving countless others? In that case, you wouldn't have to live with your decision... Umm.. what if there were three tracks, one with five people, a track with one, and a track with no one, and you can pull the lever once and it randomly selects a track... what if four tracks with only two populated... what if five tracks with only two... at what point do you play the odds?

    Well, hypothetical situations are just that, I suppose. Who the hell knows how I'd react if I were faced with similar situations. We can look back at the historical record, though, and find out how some people reacted in sacrifice-one-for-many type situations. "United Airlines Flight 93" is one that comes to mind... Not just the passengers' decision, but also Dick Cheney's decision, as he authorized it to be shot down.

  8. [19]
    JeanBaptisteDuToitIV
    Link

    Doesn't matter. Human life has no objective value. 0(5) = 0(1)

    2 votes
    1. [13]
      balooga
      Link Parent

      There's no such thing as "objective value" so that's a pretty odd metric by which to judge one's actions or moral philosophy.

      8 votes
      1. [12]
        JeanBaptisteDuToitIV
        Link Parent

        There's no such thing as "objective value"

        I agree. Therefore neither of the two choices can be objectively correct.

        1 vote
        1. [5]
          Comment deleted by author
          Link Parent
          1. [4]
            JeanBaptisteDuToitIV
            Link Parent

            If there are no objective answers, then how can either of the choices be more correct or ethical than the other?

            1. [4]
              Comment deleted by author
              Link Parent
              1. [3]
                JeanBaptisteDuToitIV
                Link Parent

                So I could choose to let five people die because death is funny and one ought to cause as much suffering as possible, and be just as correct as choosing to flip the switch for the opposite reasons, or choosing not to flip the switch to avoid becoming complicit in another's suffering, correct?

                1 vote
                1. CALICO
                  Link Parent

                  Essentially, there's no right or correct answer.

                  In isolation, one's personal answer to such a philosophical question says something about oneself.

                  Out of isolation, differing answers or the same answer for different reasons inspire conversations about values, responsibility, and such. Those conversations can lead to deeper introspection, maybe highlight an oversight in reasoning, cause changes of opinion, reinforce opinions, etc.

                  In the aggregate, how most people might answer is part of what drives the kind of society we create or perpetuate. If everyone held your hypothetical view, society would look different than if everyone made the choice to kill one to save five.
                  Similarly, outside of the number of total deaths, there are people who have issue with engaging in the problem at all. Abstaining from acting for the purpose of keeping blood off their hands. Passivity over activity, regardless of outcome. Those people are no more or less correct than anyone else. However, whether a majority of folks choose to be passive or active shifts how society functions.

                  Any reason behind any choice matters in this kind of context, because if enough people thought the same then it would be reflected in society.

                  That's part of the point in ethics, morality, and philosophy. Understanding who we are as individuals, in groups, and whether it would be beneficial if everyone were to hold the same opinion.

                  What beneficial means is another conversation on its own.

                  5 votes
                2. balooga
                  Link Parent

                  Seems like you're really hung up on the idea of an absolute, objective correctness. There's no right answer here, as you've indicated. It's all subjective.

                  That said, we're all subjectively calculating the morality of the situation too. If "one ought to cause as much suffering as possible" is the conclusion you've personally come to, we're all going to subjectively categorize you as a psychopath. And that consensus is the closest thing to an objective right or wrong that you'll find in this thread.

        2. [6]
          Thunder-ten-tronckh
          Link Parent

          Right. That's why the question is about what decision you'd make.

          3 votes
          1. [5]
            JeanBaptisteDuToitIV
            Link Parent

            The question was 'What is the right thing to do?' I'm saying that neither choice can be the (bold italics) right thing to do as neither can be objectively more moral than the other.

            1 vote
            1. [3]
              balooga
              Link Parent

              When someone asks you a question about the "rightness" of an ambiguous moral choice, they are expecting you to provide your personal, subjective opinion. That's the whole point of this sort of question.

              3 votes
              1. [2]
                JeanBaptisteDuToitIV
                Link Parent

                But it doesn't make sense to form a subjective opinion contrary to an objective truth, which is what you have to do to answer the question of which choice is correct, as we have already established that objectively neither is correct. You don't form an opinion as to whether 1 + 1 = 2, because it is objectively true. You don't form an opinion as to whether an action is right or wrong, because objectively it is neither. That's why the only rational opinion to take is that it doesn't matter.

                1 vote
                1. balooga
                  Link Parent

                  Then what's the point of opinion at all? Can a person prefer the color blue over the color yellow? Pistachio ice cream over rocky road? Objectively neither of those is better than the other. But it's okay to decide which you'd rather have, and even list some reasons why you like it.

                  Also, subjective value is real. There's a reason nobody sells anchovy ice cream: there wouldn't be enough demand for it. Such a flavor might be viable as a food, but it's effectively "wrong" according to the subjective valuation of the market. For all intents and purposes, it's incorrect (albeit not in the objective sense, which doesn't exist).

                  Same with ethics. If a population agrees it's wrong to kill indiscriminately, and one person disagrees, the consensus wins out. Maybe they're not objectively correct, but they're de facto correct. And that's close enough to an objective truth to organize society around.

                  4 votes
            2. Thunder-ten-tronckh
              Link Parent

              balooga explained my thoughts perfectly. If you didn't make that assumption, I totally get why. But that's the assumption I think many of us are working from here when entertaining the question.

        3. mike10010100
          Link Parent

          Nobody said they had to be objectively correct. The question was about which is the more ethical option.

          2 votes
    2. mrbig
      Link Parent

      The trolley problem clearly works under the assumption that human life has objective value. Otherwise, it would not be worthy of our time to think about it.

    3. [4]
      culturedleftfoot
      Link Parent

      Okay... does that mean you're in the do nothing camp then?

      1. [3]
        JeanBaptisteDuToitIV
        Link Parent

        Nah I'd flip the switch.

        1. [2]
          culturedleftfoot
          Link Parent

          Why?

          1. JeanBaptisteDuToitIV
            (edited )
            Link Parent

            Actually it kind of depends. If someone close to me was on one track I would guide the train to the other, no matter how many it would kill. With total strangers I would always flip the switch. In the second scenario I wouldn't push the man over the bridge because it would make me feel worse than letting 5 strangers die, unless it would save someone I value. I guess subjectively, 5 strangers are more valuable to me than one, but less valuable than a friend/family member, and in the second scenario, are also less valuable than myself (because if I were to save them, I myself would be harmed), unless, again, one of them is a friend/family member. Basically it's a matter of who I value the most; myself, others, or the common good. Since none of the decisions are objectively right, I don't have to be consistent in who benefits from my actions.