Consul wrote: ↑January 16th, 2019, 11:51 am
Greta wrote: ↑January 15th, 2019, 10:52 pm
Empirical neuroscience focuses largely on human consciousness, seldom consciousness per se.
What's the difference between consciousness and consciousness per se?
Neuroscience is interested in nonhuman animal consciousness as well, but the big problem is that nonhuman animals cannot make introspective reports about their consciousness.
You answered your own question immediately. Yes, it's a problem that other species can't be as easily tested. I suspect that that has played a role in the abominable way many societies treat other sentient species.
However, as with the organisation of the systems of the human body (any body, really), the solar system, the Milky Way, or the deterministic knock-on effects that rule our lives from cradle to grave, these difficult-to-test concepts are no less important aspects of the fabric of reality than more readily testable ones, only less controllable. This leads to a bias of priority and emphasis towards areas where the research path is clear. That is practical, but it is a limitation that philosophy ideally must grapple with.
Consul wrote: ↑January 16th, 2019, 11:51 am
Greta wrote: ↑January 15th, 2019, 10:52 pm
Neuroscience has nothing to say about qualia, and generally treats them as non-existent, in much the same way that Skinnerian behaviourists in psychology ignored the esoteric undercurrents for the sake of practicality and simply recorded causes and effects. So researchers will no doubt find brain structures causing all manner of events, but none pertaining to qualia as such. As things stand, consciousness is generally framed as wakefulness or awareness: the practical side.
It is true that consciousness had been neglected by neuroscience and cognitive science (now unified as cognitive neuroscience) for a long time (due to the dominance of functionalism in the philosophy of mind), but this is no longer the case. Neuroscience is no longer only a science of cognition, intelligence, and behavior but also of consciousness; and as a science of consciousness it is a science of qualia, since
"the problem of consciousness is identical with the problem of qualia, because conscious states are qualitative states right down to the ground. Take away the qualia and there is nothing there." (John Searle, Consciousness and Language, 2002, p. 26)
Very early days, equivalent to, say, Galileo's knowledge of the solar system.
Consul wrote: ↑January 16th, 2019, 11:51 am
Greta wrote: ↑January 15th, 2019, 10:52 pm
Trouble is, where do we find reliable information about the subjective? Buddhism does have a rich history of meditators recording their observations, but I expect that parsing useful material from the vast amounts of guesswork and obvious superstition would not always be easy.
Of course, a science of consciousness needs both third-person data and first-person (introspective, phenomenological) data. The epistemic reliability of introspection and introspective reports is a central issue.
There's the rub, and a constant problem for researchers: how to understand the subjective when subjective testimony is among the least reliable forms of evidence? Ideally, one would also learn of states not generally attainable in a laboratory setting.
Consul wrote: ↑January 16th, 2019, 11:51 am
Greta wrote: ↑January 15th, 2019, 10:52 pm
The brain is like the board of a company, but the gut owns it. I do concede that a hostile takeover appears to be in train, which already has us thinking about this in reverse, treating brains as the central aspect of an organism. Brains have long minimised the importance of the gut, and this is reflected in the ultimate dream of digitising minds to be independent of vulnerable wetware bodies.
The nervous system is larger than the central nervous system, also including the enteric nervous system. There is a "gut-brain axis", but the enteric nervous system is certainly not an organ of consciousness.
But it certainly plays a major role in generating qualia, which I see as the result of interaction among those body parts and others. Metabolic functions and membranes were the first fundamentals of organisms, precursors to the gut and nervous systems, and neither could be viable without the other. The gut-brain axis would seem to be an indivisible entity that we separate for practicality's sake.
Consul wrote: ↑January 16th, 2019, 11:51 am
Greta wrote: ↑January 15th, 2019, 10:52 pm
Yes, an amplifier but mostly a shaper. There would be no one organ that produces qualia just as no one organ produces life.
But there is one organ which produces qualia, viz. the brain. There can no longer be any reasonable scientific doubt that subjective experiences or appearances are realized by and in the brain, and by nothing else and nowhere else in the universe. The cerebral transformation of objective neural signals into subjective "impressions and ideas" is unlike a mere shaping of a lump of clay by a potter.
Just wait and see. My suspicion is that failure to create AI that experiences will be attributed to the use of synthetic materials, or to the need for more complex brains. I suspect that you rather need the equivalent of a gut to help make qualia happen. Until then, it's just processing electricity and information like any other appliance.
The trouble with the potter analogy is that the potter him- or herself is a major part of the artwork, but not all of it!
Consul wrote: ↑January 16th, 2019, 11:51 am
Greta wrote: ↑January 15th, 2019, 10:52 pm
So you keep saying. However, our overarching consciousness did not come from nowhere. No, this kind of complex brainpower evolved from simpler kinds, and simpler ones before that, and so on.
Of course, there is an evolutionary prehistory of consciousness which is associated with the evolutionary development of nervous systems, which finally resulted in the development of central nervous systems (brains).
Of course, the psychological/phenomenological sentience of organisms is evolutionarily preceded by their electro- and then neurophysiological sensitivity (reactivity); but there is an essential difference: the former is constituted by ontologically subjective phenomena (sense-qualia), whereas the latter is constituted by ontologically objective phenomena. For there is nothing it is like for an organism to have nothing more than electro-/neurophysiological sensitivity.
Yet where do you draw the line and deem that organism X experiences and organism Y just processes? After all, if we attribute qualia to C. elegans and its few hundred neurons, what of the human gut and its millions of neurons? We might figure that it's a matter of configuration, that it's not just numbers but the nature of the wiring. Still, there is a case for each to benefit from simple sensing, which of course is why these entities ended up with neurons.
I think there is a terribly fuzzy line here between phenomenal and functional consciousness, between sensing and reflexes. I think it feels like something to have a reflex action.
Consul wrote: ↑January 16th, 2019, 11:51 am
Greta wrote: ↑January 15th, 2019, 10:52 pm
The way I see it, the basic unit of consciousness is the reflex. Using the water analogy, reflexes are pools and consciousness is a river: a complex series of pools which, when there are enough of them, gives rise to larger, river-like dynamics.
Emotions, for example, are huge evolved suites of reflexes that work in concert when a sufficiently-brained organism is presented with stimuli. They are, in essence, naturally selected subroutines, called under certain circumstances. Basic emotions like fear (fight or flight/startle) and pleasure/satiation are very common in nature, generally pertaining to equilibrium. More complex emotions will, of course, pertain to social living.
Emotions are just very complex reflex responses that intelligent machines, with their much faster processing speeds, will not need. A human faced with an attacker will go through useful reflex responses: adrenaline release; increased breathing, heart rate and blood pressure to force-feed organs and muscles for quick response; plus a tendency to fight or flee. In much less than the time needed for a human body to prepare for such a problem, an intelligent machine would simply calculate the optimal moment and mode of attack or defence and execute it.
Reflexes have nothing to do with (phenomenal) consciousness as long as they don't involve any subjective experiences. No matter how complex and sophisticated its reactive behavior is, an organism is nothing but a zombie agent as long as it doesn't subjectively feel or sense anything.
"The irreducible minimum involved in mentality would seem to be the fact which we express by the phrase 'feeling somehow'[.]"
On what basis do we determine 'feeling somehow'? Brains and nervous systems might not be enough, perhaps being dedicated only to complex "zombie agent" reflexes. Given the indeterminacy of the situation, plus humanity's history of dismissing as nothing that which it deems insignificant, I suspect that any consciousness or feeling we attribute to other entities is still significantly underestimated.