Modulo the HUGE computational cost of simulating these things, not to mention the non-trivial task of determining network parameters.
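To give a sense of just how huge: a rough back-of-envelope, where every number is a commonly cited order-of-magnitude assumption (neuron count from Herculano-Houzel's estimate; the synapse count, timestep, and per-synapse cost are illustrative guesses, not measurements):

```python
# Back-of-envelope: cost of simulating a human brain in real time.
# Every constant here is a rough, assumed order of magnitude.
neurons = 86e9             # ~86 billion neurons
synapses_per_neuron = 1e4  # order-of-magnitude synapses per neuron
timesteps_per_sec = 1e4    # 0.1 ms integration step
flops_per_synapse = 10     # assumed per-synapse update cost per step

flops = neurons * synapses_per_neuron * timesteps_per_sec * flops_per_synapse
print(f"{flops:.1e} FLOP/s")  # prints "8.6e+19 FLOP/s"
```

That's on the order of 100 exaFLOPS sustained, roughly the peak of the largest supercomputers today, and that's with a cheap point-synapse model rather than anything biophysically faithful.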
Surrogate models are a thing though, and it's going to be an interesting time as we gradually figure out what approximations and optimizations are 'acceptable', and what computations really are necessary for "intelligence" (whatever that might be).
Arguably, you'd need to simulate tens of billions of neurons to achieve human-level intelligence. So even if you had correctly simulated a single neuron, there would still be a lot of ground to cover. And even then: in the few unfortunate cases where a human child was reared outside a normal environment (see Genie), having a human brain turned out not to be enough to develop intelligence that would be recognized as 'human-level'. So apart from the simulated brain itself, you'd need to devise an appropriate environment in which to train said brain up to human-level intelligence, which would be a formidable feat in and of itself.
That doesn't follow at all. Human neurons aren't much different from mouse neurons, or maybe even chicken or mosquito neurons. After you faithfully model a neuron, you still need untold myriads of them and their interconnects to get a human brain.
> Human neurons aren't much different from mouse neurons
We don't actually know this. Yes, the cerebellum, substantia nigra, and other regions preserved across mammals probably have conserved neural structure as well. But the human neocortex has quite radically different gene expression compared to rats (which results in morphological differences). There could very well be "more processing power per neuron" in humans vs. rodents.
Also, sensations and feelings are not a logical/mathematical byproduct of neurons; no matter how "well" you simulate "neurons", feelings and sensations will not emerge.
Unless you believe in a transcendent soul that could be the source of these sensations or feelings, this assertion doesn't make sense. Assuming there is no supernatural soul, it's logically impossible for anything humans experience to not arise from the human body.
This entire notion of qualia is a philosophical quagmire predicated on the idea that if you can imagine something, it must be true ("we can imagine a zombie that behaves exactly like a human, but doesn't have qualia at all"). It's actually as laughable as the "argument from perfection" for the existence of a god.
> You just need biological machinery and computers are not biological machinery capable of producing sensations.
This is a postulate, not an argument. My contention is that qualia are meaningless: like saying there is such a thing as "feeling like you're computing the number 1000" for a processor, or "feeling like you're really hard granite" for a piece of granite. Just because we can express it doesn't mean it makes sense.
All of the conundrums about qualia go away if we just accept this. Alice would not in fact experience anything new when she saw red for the first time, if she knew everything about human cognition and the physical properties of the color red.
I do absolutely agree that we know almost nothing about how these processes actually happen in the brain, and most attempts at AI and bombastic predictions about replacing humans are off the mark by centuries. But that is no reason to assume that there is something completely different going on in animal brains than computation, in the wide sense of the Turing machine model.
I don't think we have the faintest clue what the subjective experience of self (qualia, if you like) actually is, so we're in no position to say it isn't artificially reproducible.
Said models still don't seem to be very close to being able to predict the behaviour of biological neurons though.
If they did, then we'd have invented artificial human-level intelligence.