Hacker News

This is trying to train a neural network to match the behaviour of various mathematical models of biological neurons.

Said models still don't seem close to being able to predict the behaviour of biological neurons, though.

If they did, then we'd have invented artificial human-level intelligence.



Modulo the HUGE computational cost of simulating these things, not to mention the non-trivial task of determining network parameters.

Surrogate models are a thing though, and it's going to be an interesting time as we gradually figure out what approximations and optimizations are 'acceptable', and what computations really are necessary for "intelligence" (whatever that might be).
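A surrogate model in this sense can be sketched very simply. The toy example below (my own illustration, not from the article) fits a cheap polynomial surrogate to the f-I curve of a leaky integrate-and-fire neuron, with the analytic LIF model standing in for an expensive biophysical simulation:

```python
import numpy as np

def lif_firing_rate(i_input, tau=0.02, v_th=1.0, v_reset=0.0, r=1.0):
    """Analytic firing rate (Hz) of a leaky integrate-and-fire neuron
    driven by constant current. Returns 0 below rheobase (R*I <= v_th)."""
    if r * i_input <= v_th:
        return 0.0
    # Time for V(t) = RI + (v_reset - RI) e^{-t/tau} to reach threshold
    t_spike = tau * np.log((r * i_input - v_reset) / (r * i_input - v_th))
    return 1.0 / t_spike

# Build a training set by querying the "expensive" model
currents = np.linspace(1.05, 3.0, 200)
rates = np.array([lif_firing_rate(i) for i in currents])

# Cheap surrogate: a low-order polynomial fit of the f-I curve
coeffs = np.polyfit(currents, rates, deg=5)
surrogate = np.poly1d(coeffs)

# The surrogate is far cheaper to evaluate and close on the fitted range
err = np.max(np.abs(surrogate(currents) - rates))
```

In practice the "expensive" model would be something like a multi-compartment Hodgkin-Huxley simulation and the surrogate a trained network, but the trade-off is the same: pay the full cost once to generate training data, then evaluate the cheap approximation everywhere else.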


Arguably, you'd need to simulate tens of billions of neurons to achieve human-level intelligence. So even if you'd correctly simulated a single neuron, there would still be a lot of ground to cover. And even then it might not be enough: in the few unfortunate cases where a human child has been reared outside a normal environment (see Genie), having a human brain turned out not to suffice for developing intelligence that would be recognized as 'human-level'. So, apart from the simulated brain itself, you'd need to devise an appropriate environment in which to train said brain, which would be a formidable feat in and of itself.


Sensations cannot be produced, no matter how many neurons you simulate.

We can only model certain phenomena we deem important, but who can really say?

I'm happy with "less than human" simulations if they help us automate and solve problems; in fact, I'd be disgusted by a perfect human simulation.


That doesn't follow at all. Human neurons aren't much different from mouse neurons, or maybe even chicken or mosquito neurons. And after you faithfully model a single neuron, you still need untold myriads of them, plus their interconnects, to get a human brain.


> Human neurons aren't much different from mouse neurons

We don't actually know this. Yeah, the cerebellum, the substantia nigra, and other regions preserved across mammals probably have conserved neural structure as well. But the human neocortex has quite radically different gene expression compared to rats (which results in morphological differences). There very well could be more "processing power per neuron" in humans vs. rodents.


Also, sensations and feelings are not a logical/mathematical byproduct of the neurons; no matter how "well" you simulate "neurons", feelings and sensations will not emerge.


Unless you believe in a transcendent soul that could be the source of these sensations or feelings, this assertion doesn't make sense. Assuming there is no supernatural soul, it's logically impossible for anything humans experience to not arise from the human body.

This entire notion of qualia is a philosophical quagmire predicated on the idea that if you can imagine something, it must be true ("we can imagine a zombie that behaves exactly like a human, but doesn't have qualia at all"). It's actually as laughable as the "argument from perfection" for the existence of a god.


There is no need for a "soul" to produce qualia.

You just need biological machinery and computers are not biological machinery capable of producing sensations.

The phenomenon itself is not the same as a description of the phenomenon, or a model/simulation of it.

No matter how well the model predicts the phenomenon, it is still not the same thing.

But my postulation was that the simulation won't even be that good because we're still missing so much information.

Further, even IF we could "simulate" every quantum particle itself, the simulation would not be the same thing.


> You just need biological machinery and computers are not biological machinery capable of producing sensations.

This is a postulate, not an argument. My contention is that qualia are meaningless - like saying that there is such a thing as "feeling like you're computing the number 1000" for a processor, or "feeling like you are a really hard granite" for a piece of granite. Just because we can express it doesn't mean that it makes sense.

All of the conundrums about qualia go away if we just accept this. Alice would not in fact experience anything new when she saw red for the first time, if she knew everything about human cognition and the physical properties of the color red.

I do absolutely agree that we know almost nothing about how these processes actually happen in the brain, and most attempts at AI and bombastic predictions about replacing humans are off the mark by centuries. But that is no reason to assume that there is something completely different going on in animal brains than computation, in the wide sense of the Turing machine model.


Qualia are not meaningless, because we are nothing without them.

Take sensations away and you turn into a vegetable in a couple of hours -- see solitary confinement, isolation tanks and so on.


I don't think we have the faintest clue what subjective experience of self (whatever you call it, qualia?) actually is, to be able to say it isn't artificially reproducible.


We know that computers are absolutely not the kind of machinery to produce sensations.

We know non-biological organisms don't produce qualia.

We know it involves chemical reactions, because we can change qualia with drugs.


The LN model doesn't always work: https://t.co/RNidCEXWcC?amp=1
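For readers unfamiliar with the acronym: LN here refers to the linear-nonlinear cascade model of a neuron, where a stimulus is passed through a linear temporal filter and then a static nonlinearity to predict the firing rate. A minimal sketch, with the filter shape and nonlinearity chosen purely for illustration (not taken from the linked paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear stage: a temporal filter; a biphasic kernel is a common choice
t = np.arange(40) * 1e-3                       # 40 ms filter at 1 ms bins
kernel = np.sin(2 * np.pi * t / 0.04) * np.exp(-t / 0.015)

# Static nonlinearity: softplus keeps the predicted rate non-negative
def softplus(x):
    return np.logaddexp(0.0, x)                # log(1 + e^x), numerically stable

# White-noise stimulus
stim = rng.standard_normal(5000)

# LN cascade: filter the stimulus, then apply the pointwise nonlinearity
drive = np.convolve(stim, kernel, mode="same")
rate = softplus(drive)                         # predicted instantaneous rate
```

The linked result presumably concerns stimuli for which no choice of filter and nonlinearity reproduces a real neuron's response, which is exactly the kind of model inadequacy the parent comments are discussing.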



