7 votes

Detroit, Westworld, and moving androids beyond human

3 comments

  1. [3]
    balooga

    I'm not sure what the author's thesis is, apart from "new science fiction builds on concepts explored by earlier science fiction." It's a valid observation, if an obvious one.

    I think a distinction should be made here, but isn't, between the contexts where certain philosophical debates are declared to be "settled." Moviegoers may accept that Blade Runner's replicants are essentially human, and be ready to expand on that premise in the sequel. That's reasonable, but it doesn't mean society has closed the book on these matters in the real world.

    I have long enjoyed fiction about robots and AI. From Data and the holographic characters in Star Trek, to Asimov's androids, to Samantha in Her, I can think of countless examples of great stories on the subject. But personally, I am not now — nor will I ever be — willing to consider an artificial entity "alive" or deserving of human rights. I say this as a software engineer, someone with reasonable insight into how these systems operate. I have to put my foot down and declare that just because a machine emulates human behavior does not make it human. Just because I love stories about humanlike machines does not mean I empathize with that perspective in reality.

    I do concede that it's a philosophical question with no objective answer. You have to get into the weeds of determining what it really means to have consciousness, or to be alive. It's a deep hole, and one that I've spent a lot of time in. Ultimately, though, I conclude that the more I learn about computer systems, the less I trust them to be truly intelligent, trustworthy, moral, or humanlike.

    2 votes
    1. [2]
      cfabbro

      But personally, I am not now — nor will I ever be — willing to consider an artificial entity "alive" or deserving of human rights. I say this as a software engineer, someone with reasonable insight into how these systems operate. I have to put my foot down and declare that just because a machine emulates human behavior does not make it human.
      ...
      Ultimately, though, I conclude that the more I learn about computer systems, the less I trust them to be truly intelligent, trustworthy, moral, or humanlike.

      The more I read about neurophilosophy and cognitive neuroscience, the more I am convinced of what Sam Harris espouses: that free will is an illusion and everything we do is deterministic. As a result, we are in essence no different from a computer system running code... it's just that our CPU is made of meat and our code is memetic. So why shouldn't a sufficiently advanced synthetic being, capable of fully emulating our behaviors, have the same rights as us?

      2 votes
      1. Adys

        I go with a simpler definition: "Do unto others as you would have them do unto you."

        If an AI is capable of treating me with respect or disrespect, I ought to treat it the same way.

        2 votes