
>Functionally equivalent but no soul.

I think it's an incoherent concept. If the 'soul' is anything, there is some difference in the brain that would allow it. You can't have two people with identical brains and claim that one has a soul, unless you believe in magic.



It's been interesting seeing those with an entirely materialist understanding of the universe try to grapple with and deal with consciousness as LLMs have emerged.

To maintain that belief you have to either believe there is a physical source of consciousness, or that consciousness doesn't exist, neither of which most people can accept, yet which LLMs are forcing people to confront.


You think most people can't accept that consciousness is a phenomenon with a basis solely in physical reality?


Also LLMs represent, for most people I suspect, something entirely unconnected with notions of consciousness.


Is there a non-magical description of a soul? I thought it was an essentially magical concept. I’m not sure that makes it incoherent.


But you can have things like property dualism, strong emergentism, cognitive closure, neutral monism and panpsychism. Naturalistic philosophers don't appeal to the soul or magic, they just say there's more to nature than the stuff described by the physical sciences. You can disagree of course, but then you have to account for everything in nature, including consciousness, using only physical constituents. And best of luck with that. Nobody has succeeded so far.

The problem is that subjective experiences don't fit well with objective explanations. Nagel laid this out in his paper "What Is It Like to Be a Bat?", and other philosophers expanded on it with p-zombies (Chalmers) and Mary the color scientist (Jackson).


I think the only property humans consider special in themselves is emotions. I have the following thesis: in evolution, exact pattern matching is computationally expensive. Just as I recognise my mother regardless of her clothes, her hairstyle, her aging face, the match isn't exact. The brain uses heuristics to pump up an 89 percent match to a 100 percent match, which saves computational resources. These heuristics are what we call emotions.

When a snake-like object appears in the visual field, the emotional heuristic pumps up a slight resemblance to generate a fear response; analytical observation would take too much time. That's why emotions appear pre-rational.

Capgras syndrome shows what happens when the emotional heuristic malfunctions: visual recognition is intact, but the heuristic that pumps up the recognition score fails, so the patient recognises the face yet feels it belongs to an impostor. These arguments suggest emotions are mere computational shortcuts.
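To make the "pump up an 89 percent match" idea concrete, here is a minimal toy sketch. Everything in it (the function names, the feature sets, the 0.9 threshold, the boost factor) is invented for illustration, not taken from the comment or from any neuroscience model; it just shows a cheap boost pushing a partial match past a decision threshold instead of paying for exact comparison, and what happens when that boost is absent.

```python
# Toy sketch of the "emotional heuristic as computational shortcut" idea.
# All values and names are illustrative assumptions.

def raw_similarity(observed: set, stored: set) -> float:
    """Crude feature overlap in [0, 1]; stands in for expensive exact matching."""
    return len(observed & stored) / max(len(stored), 1)

def emotional_boost(score: float, salience: float = 0.3) -> float:
    """Pump an already-decent score toward 1.0 (the '89% -> 100%' shortcut)."""
    return min(1.0, score + salience * score)

MOTHER = {"face", "voice", "gait", "smile"}
observed_today = {"face", "voice", "smile", "new_haircut"}  # inexact match

score = raw_similarity(observed_today, MOTHER)   # 0.75: a partial match
recognised = emotional_boost(score) > 0.9        # shortcut says "yes, it's her"

# Capgras-style failure: the boost is missing, so an intact visual match
# never clears the threshold and the face feels like an impostor's.
capgras_recognised = score > 0.9                 # False

print(recognised, capgras_recognised)            # True False
```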


How do you explain the visual field? EM radiation isn't colored in the range visible to humans; that's just how we see it. There are no physical colored properties, rather there are wavelengths. But there are color experiences. Same goes for the other sensations. You don't need to go as far as emotions. Sensation alone is hard to explain in physical terms. Computation doesn't explain it either. Color values are just numeric properties we assign to the shades of our color experiences. But they aren't the experiences; they're just numeric values or symbols stored for some device that can output the corresponding wavelengths of light.


There is no analytical, rational reason to avoid being killed by a snake. Continuing to live has no analytical or rational purpose.

The only reason "avoid dangerous things like snakes so I continue to live" counts as a goal is emotion.


Think about it this way: we have System 1 and System 2, and emotions reside in System 1. Now imagine we have built AI systems so fast that they can run the complex deliberation of System 2 at the speed of System 1. Then the AI's emotions will be more rational.


System 1 and System 2 are metaphors, not actual systems.

Emotions do not exist only in System 1, which is not actually a real, separate thing anyway. People who take damage to the emotional centers of their brains become passive and apathetic, not hyper-intelligent System 2 Vulcans.



