After reading the waitbutwhy post, I've been thinking "yes, please let me have this", although I'd probably need some way to verify the hardware/software (for now, privacy isn't 100% dead yet). But uhh, addressing the post:
...and even free will in a world dominated by super-intelligent machines connected directly to our mind?
I don't believe in free will, but if it turns out that we do have free will due to some kind of quantum randomness or something, I have a gut feeling that adding a computer to the mix wouldn't change anything.
Wouldn’t a constant connection to a cloud with virtually limitless computing capacity lead to total dependency, to a radical loss of human autonomy, and ultimately, to the total dehumanization of society?
Yes... And no. We're kind of already there in a way: we already have near-total dependency, and a lot of the time we're already using computers to control ourselves. That last point is complete nonsense without a definition of humanity. I mean... is a brain-uploaded human "human"? What about aliens, AGI, etc.?
But here’s where this logic goes wrong: Fundamentally messing with who we are as humans...
We do that a lot already; humans are giant pattern-matching machines, and it doesn't take all that much effort to radically change a human.
...redesigning our biology and our chemistry...
Brain implants aren't so much "redesigning" as extending: you'd be adding something to your biology.
and transcending the limitations of our minds and hearts
Ignoring the fact that the definition of "heart" used here (some metaphysical thing) and my definition of "heart" (the physical organ) are different: we have always used things to "transcend" the limitations of our brains and bodies. We invented language and tools as ways to improve ourselves like that; in my opinion, brain implants are just the logical next step in that process.
quintessentially human attributes such as giving birth should not be amputated
Why? Also, giving birth isn't a uniquely human attribute; it's an attribute that most mammals have, and that almost all living things have in some way, shape, or form.
What I’m worried about the most — and what you need to consider as well — is whether we’ll
even have an actual choice to “opt out” if BCIs are implemented.
Valid concern, if one does end up wanting to opt out. It will probably be optional for a while, and then become required (gut feeling again).
will we be able to find a good job without a BCI?
Maybe. For a while at least, yes (assuming BCIs are invented in a vacuum and nothing else happens to replace those jobs anyway); technologies don't replace things right away, they take time to be adopted. In reality, "higher skill" jobs will probably require them sooner, since the people there can actually afford them and they'd be more useful there.
How far would you go to give your child an edge? Will it be just a matter of principles, or also a financial decision, that will lead to even more inequality?
I honestly have no clue. How far would someone go to give their child an edge? To "not have to worry about the things I had to worry about"?
AI will then work its way into every part of our life and someday we won’t be able to function without it, losing our independence and a lot of what makes us human.
What makes human + AI < human? I can't come up with anything myself. We'd probably become something different, but is that really a bad thing? Is there a reason why change is bad? Our brain is effectively an "intelligence thing" (neocortex) on top of a "monkey thing" (limbic system) on top of a "frog thing" (in a nutshell, anyway). My fundamental question here is: what's wrong with adding a "super-intelligence thing" on top of that? My frog thing had a blast with my monkey thing, and my monkey thing has seen quite a lot because of my intelligence thing.
Today, more questions are being posed than answered. The reason is that before an age of acceptance, there needs to be an age of discussion to sort out ethical and moral issues. We need to raise questions, issue warnings and keep a close watch on advances in this nascent technology before we agree to a buy-in, and before we lose our right to decide.
Sure, I can agree to that.
And thus concludes the first episode of izik1 hopefully isn't off-topic or inflammatory while talking about something and getting their opinion out there.
I'm pretty much with you on this - your point by point is very similar to what I was thinking as I read the article.
It feels like the author is handwaving around "humanity" as a concept and trying to sound objective when what they really mean is "brain modification is scary and challenges my sense of self". I don't hold the latter against them (although I don't really feel it myself), but I'd prefer a bit more intellectual honesty about how subjective, unknown, and potentially unknowable it is.
If they show any promise I'll be an early(ish) adopter for sure. My main concern is that putting something in the brain is a relatively dangerous operation, and a BCI isn't useful if you die or are crippled.
I get your point (which is that it crippling you wouldn't be worth it). But I'd imagine that a lot of the first few batches of people getting these will already be disabled (quadriplegic, deaf, blind), precisely because the BCI might be able to solve the problem. (It's normally a problem with things connecting to the brain, not the brain itself; but in order to make a replacement, you need to connect the brain to the replacement, and the replacement is likely a computer.)
Yep, as I mentioned in my other comment, my participation in this thread came after a waitbutwhy craze. ^-^
Although now that I think about it, I also went on a biohacking craze right after? before? that.
As someone without any physical disabilities, I would not use a brain implant; however, someone I'm close to has cerebral palsy and I couldn't be more excited for something that might one day heal them.
My biggest concerns with brain implants are security and privacy. Any networked device may be hacked, and there isn't a hardware manufacturer today that has resisted privacy invasions such as PRISM.