Indeed. Building an AI that matches human intelligence using equal or less mass than a human brain requires one or both of two things to be true:

1. The computational mass efficiency of brain tissue is very far from optimum. Considering the amount of time evolution has been improving upon it, I highly doubt that is true.

2. Most of the brain's computation is not involved in cognition. That may be true. We don't really know.
There are hard limits. No matter how you try you can't perfectly simulate three atoms using two atoms. If it turns out we have to, in software, represent fifty percent of neuronal activity to create consciousness we're in real trouble. A dragonfly can take inputs from thousands of ommatidia and use them to track targets in space using only sixteen neurons. How many transistors would it take us to do the same? Take that ratio and apply it to the 86 billion neurons in the human brain and you have a rough idea of what it will take to create strong AI. The numbers aren't promising.
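The extrapolation in that last paragraph can be sketched as back-of-envelope arithmetic. Note that `transistors_per_task` is a made-up placeholder: nobody actually knows how many transistors it takes to replicate the dragonfly's 16-neuron target tracker, so the final figure only illustrates how the ratio scales.

```python
# Back-of-envelope extrapolation from the dragonfly example.
# transistors_per_task is hypothetical -- the real cost of
# replicating 16-neuron target tracking in silicon is unknown.
neurons_dragonfly = 16
transistors_per_task = 1_000_000      # assumed, for illustration only

transistors_per_neuron = transistors_per_task / neurons_dragonfly

neurons_human = 86_000_000_000        # ~86 billion neurons
transistors_needed = neurons_human * transistors_per_neuron
print(f"{transistors_needed:.3e} transistors")  # 5.375e+15 at this ratio
```

Whatever number you plug in for the dragonfly task, the per-neuron ratio gets multiplied by 86 billion, which is why the conclusion is so sensitive to that one unknown.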
Brain tissue needs to optimize for a lot of things other than computational efficiency. It needs to stay operational for decades with minimal replacement of parts, and it needs to be resilient to a fair amount of bumps, disease and chemical injury. Silicon chips don't have to be built to survive these conditions, so it's possible they can be much more efficient at the computational aspect.
> 2. Most of the brain's computation is not involved in cognition. That may be true. We don't really know
I think that's largely known, depending of course on how you define "cognition".
Huge tracts (I don't have numbers) of the cortex are dedicated to things like vision, motor control, etc... Those aren't "cognition" as generally understood, and there are many stroke victims out there who can testify (like, actually "testify", in the sense of using their brain to explain it to you) to the fact that they can no longer see, or move their left side, etc... Their "cognition" is not impaired.
It gets fuzzier with things like speech and recognition, which also have dedicated real estate but are, kinda, "para-cognition" tasks.
Really, yes: you can have a "thinking" engine with a tiny fraction of the computation power of the human brain. I think most folks agree with that. The broader question is: with so limited an I/O structure, what is there for it to think about?
> 2. Most of the brain's computation is not involved in cognition. That may be true. We don't really know.
I thought this at least was fairly well understood: We do in fact use our whole brains, as anything less would be a fantastic waste of resources, which evolution would have taken care of long ago. We have numerous human-specific adaptations to deal with the relatively massive brains we're carrying around.
When you're building an AI you may not need the neurons involved with, for example, breathing. That's what I'm talking about. I'm not a neuroscientist so I don't know for sure whether all the neurons we use for muscle and organ control do double-duty to help us cogitate.
"In animals, it is thought that the larger the brain, the more brain weight will be available for more complex cognitive tasks. However, large animals need more neurons to represent their own bodies and control specific muscles; thus, relative rather than absolute brain size makes for a ranking of animals that better coincides with the observed complexity of animal behaviour. The relationship between brain-to-body mass ratio and complexity of behaviour is not perfect as other factors also influence intelligence, like the evolution of the recent cerebral cortex and different degrees of brain folding, which increase the surface of the cortex, which is positively correlated in humans to intelligence."
I think the idea that brain tissue is near optimally efficient is interesting. Yes, it's had a long time to evolve. But the same can be said about photosynthesis, which is less efficient at capturing solar energy than PVs. Brain tissue evolved under the constraint that it had to be built by biological systems from the resources we could eat. Is it not plausible that some very efficient computational substrate could be made, but requires minerals and chemical and industrial processes that would be toxic or impossible for life?
> But the same can be said about photosynthesis which is less efficient at capturing solar energy than PVs.
The instantaneous efficiency is much lower, sure, but the lifetime efficiency is another question. The resource cost to create a plant is the nutrient/energy cost of producing and dispersing a seed. The energy cost of producing and installing a solar panel is enormous by comparison, and takes years if not decades to capture more resources than it took to produce.
Home solar panels capture 11-15% of incoming energy. Plant leaves capture 3-6%. Do you have solar panels now? If you could throw five dollars in seeds over your roof and get half that amount of electricity, would you?
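Using just the efficiency ranges quoted in that comment, the trade-off can be made concrete. This takes the midpoints of both ranges; the "half that amount" framing then falls out of the ratio:

```python
# Rough comparison using the efficiency ranges quoted above.
panel_eff = (0.11 + 0.15) / 2   # home solar panels: 11-15%, midpoint 13%
leaf_eff = (0.03 + 0.06) / 2    # plant leaves: 3-6%, midpoint 4.5%

# Same roof area, same sunlight: the leaf captures this fraction
# of what the panel would.
relative_output = leaf_eff / panel_eff
print(f"leaves deliver ~{relative_output:.0%} of panel output")
```

At the midpoints that works out to roughly a third of panel output, which is the gap the comment is asking you to weigh against the enormous difference in up-front cost.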