Sure, but is the full human brain the minimum set necessary?
Sentience/sapience is probably an emergent property of a set of neurons that need to coordinate, plan, and predict both the future and themselves in relation to it.
I suspect that AI is capable of sentience given sufficient complexity and training, but it isn't there yet. I also suspect we'll be well past the point where it is before we realize it, though probably not until we make some fundamental change in how we build it. We know human-level intelligence is possible in the volume and power consumption of, well, a brain, so we're orders of magnitude away from the efficiency limits.
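A rough sense of that "orders of magnitude" claim can be sketched with back-of-envelope arithmetic. The wattage figures below are illustrative assumptions, not from the comment: a human brain is commonly estimated at around 20 W, while a large AI training cluster can draw on the order of megawatts.

```python
import math

# Illustrative, assumed figures (not measurements):
brain_watts = 20            # commonly cited estimate for a human brain
cluster_watts = 10_000_000  # order-of-magnitude guess for a large training cluster

ratio = cluster_watts / brain_watts
print(f"~{math.log10(ratio):.0f} orders of magnitude apart in power draw")
```

Even if the cluster estimate is off by 10x either way, the gap stays at several orders of magnitude, which is the comment's point.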
It’s estimated that mice have 70 to 100 million neurons in their brains. They are capable of feeling pain and form social hierarchies. They also experience emotions like fear, pleasure, and anxiety. (We use them in pharmacology models of many mental illnesses.)
Have you ever heard the phrase “neurons that fire together, wire together”? Our neurons are in a constant feedback loop with the environment we experience. Our experiences shape how our neurons form interconnected networks, which in turn shapes how we act on the environment.
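That phrase describes Hebbian plasticity, and the core rule is simple enough to sketch in a few lines. This is a toy rate model with made-up names and values, not a biological simulation: the weight of a connection grows in proportion to how often the pre- and postsynaptic neurons are active at the same time.

```python
def hebbian_update(w, pre, post, lr=0.1):
    """Hebbian rule sketch: strengthen w when pre and post fire together.

    w    -- current connection weight
    pre  -- presynaptic activity (0.0 to 1.0)
    post -- postsynaptic activity (0.0 to 1.0)
    lr   -- learning rate (illustrative value)
    """
    return w + lr * pre * post

w = 0.5
# Correlated activity: both neurons active, so the weight grows each step.
for _ in range(5):
    w = hebbian_update(w, pre=1.0, post=1.0)
print(w)  # grows from 0.5 to 1.0 over five correlated steps

# Uncorrelated activity: presynaptic neuron silent, so the weight is unchanged.
print(hebbian_update(0.5, pre=0.0, post=1.0))  # stays 0.5
```

The point the comment makes falls out of the rule: connections only change where there is activity, so an environment that never drives certain patterns never wires them in.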
If those neurons connected to the computer chip only ever experience playing the game “DOOM,” how would they know about anything else? How could they know about pain without limbs to feel it with? How could they have a social hierarchy without others to interact with? We may as well be god to those neurons on the PC chip, because we control the entire world they have access to.
What I find sad is that our society is okay with hooking living cells up to a computer to make smarter computers, but has a problem with ethically harvesting stem cells to treat diseases.
It has to be a full fetus with a heartbeat to have rights. /s In all seriousness, the human brain is estimated to have 86 billion neurons.
It’s because the stem cells somehow threatened the religious hegemony.