Personally I think it’s silly as hell. Qualia are obviously biological components of experience… not some weird things that science will never be able to put into words.
I’ve been listening to a lot of psychology podcasts lately, and for some reason people seem obsessed with the idea, even though believing it requires the same logical leaps as any sort of mysticism… Maybe I am just tripping idk


No mechanism by which to provide the data that makes up said experience. No neurons.
I think this is insufficient. What do neurons have that the mechanical systems through which the motion of every particle on Earth influences every other particle don’t?
Neurons transmit information to the brain, and the brain does the central processing of that information. There doesn’t appear to be a means to encode and transmit data for Gaia the way a neuron transcodes and transmits data for the brain.
The purpose of our neurons’ transcoding of information seems to be directly related to a desire for homeostasis. Gaia may have some abstract form of homeostasis in an ecological-entropy sense, but there is no central processing unit interpreting the data.
There is nothing the neurons innately “have” that makes this possible, in the same way there is nothing an atom has that makes water wet, yet that property emerges from atoms.
Well, isn’t there? There’s a pretty massive amount of matter, particles flying around whose positions and velocities are all entangled because of their interactions, much like the particles in any mechanical system.
There isn’t exactly one in the brain, either. It’s like how in a computer’s CPU there’s not really any individual part you can single out as “doing the computing.” There are special-purpose registers, general-purpose registers, a control unit, a data path, an ALU, and so on. These things, by their interactions, cause computing to happen. As far as we know, a central nervous system is the same: a huge number of neurons interacting with each other, with some parts of the central nervous system apparently linked to specific functions like long-term memory or visual processing, but you can’t really point to a specific physical property of neurons that enables consciousness, as you said.
IMO there’s no good way to dismiss the conclusion that very large physical systems, like planet Earth or even the entire universe, interact in a way that’s not fundamentally different from how a brain interacts with itself. So unless there’s something other than the physical interactions between neurons at play, those systems must be able to experience the same kind of consciousness.
There is no “enabling” consciousness. Consciousness is simply first person experience. The hum of the machine. It’s all calculations all at once aimed toward homeostasis. We can pick away every sense you have until there is no consciousness left.
Consciousness seems to be agentic: a unified experience that the universe obviously doesn’t have, because we are each subjectively experiencing it separately.
We can certainly point to specific physical properties that prevent consciousness from arising. If I snap-froze your brain, you would lose consciousness; if we snap-unfroze you, you would resume consciousness from the moment we unfroze you.
Are all computational devices conscious? If not, why not?
I do think we will recreate consciousness by computational means, but our current computers are not conscious as we currently define it, since they do not really attempt to achieve homeostasis.
You say that, but the Earth experiences vast and slow movements in its magnetohydrodynamic inner layers, and a distribution of momentum via the force of gravity alone, out to the Moon and in from the gentle tugs of the other planets and the Sun. If we assign a will to the universe, building consciousness seems to be a side effect of wanting to collect all matter in neat spheres.
I wouldn’t assign a will to the universe, as that seems to be sneaking in teleological thinking. There is no apparent mechanism for the Earth or the universe to collect diagnostic data, but I am willing to admit maybe there is a giant magical brain we have yet to discover really far away. If we find that brain, I’ll change my tune.
How exactly are you defining “collecting diagnostic data” without sneaking in teleological thinking?
I am not sure what this means. Are you suggesting an agent collecting diagnostic data is teleological? Or are you suggesting that the need for an agent to collect diagnostic data is teleological?
I can imagine plenty of things that collect data that aren’t teleological but I am probably just not understanding your question.
A universal will is inherently teleological. An individual will that developed randomly within the universe doesn’t have to be.
It seems as though you’re taking the stance of “I’ve experienced thinking arising one way (neurons), therefore it could never happen any other way.” Really, though, you have a dataset of 1. It’s hard to build a theory off of one data point.
Neurons have a specific electrical function and interconnection that we can demonstrate experimentally.
Yes but that specific mechanism has never been demonstrated to have anything to do with consciousness beyond the fact that interrupting them also interrupts consciousness. If we’re positing that consciousness is something that belongs to each individual part of the biological system, but then a singular consciousness that corresponds to the whole being can arise out of their interaction, why should that part of the process be limited purely to electrical connections between neurons?
I’m not saying it has to be limited to only neurons, but until there’s a way to test whether or not entire ecosystems can have a consciousness like ours, I am skeptical of these claims.
The Pacific Ocean is a synaptic gap and I am throwing fistfuls of potassium into it to communicate with my friend in Japan.