Here is the text. I don't think the article offers much insight into the statement, and you're better off just reading it for yourself.
The most common theme here is moral agency. The authors say that humans are the only moral agents in the universe. With the "weak" AI we have today I agree, but at some point we'll create an AI as complex and capable as a human mind. I don't believe there's any difference in the validity of an opinion reached by inorganic thought. Until we can concretely define consciousness it's not possible to say a device that accurately mimics a human brain lacks it, and I'd argue that any conscious sentient being has at least some moral agency.
We affirm the goodness of God’s design for human sexuality which prescribes the sexual union to be an exclusive relationship between a man and a woman in the lifelong covenant of marriage.
This feels out of place among the talk of moral agency, data privacy, and technology. They just had to shoehorn it in here.
The use of AI may mitigate the loss of human life, provide greater protection of non-combatants, and inform better policymaking.
I'm completely opposed to war, but I've also thought the same thing. For the brief period when only a few nations control AGI capable of pulling off assassinations, there may actually be a window of extremely limited casualties. Of course, that period won't last long. After that it will be an absolute shitshow.
What, like a Southern Baptist is going to focus on one task for more than 15 minutes without thinking about gay sex?
BRB, I'm gonna go create a gay artificial intelligence out of spite.
You're right. But I like to pretend people aren't so predictable. Maybe it's respect for their ability to change; maybe it's my own naivete.