Djacob7 wrote:
JamesOfSeattle wrote:
"Theoretically, a zombie doesn't have the experience even though it thinks it does."
That's why I firmly believe that we are zombies: We think we have experiences when all they are are neuronal coding algorithms.
"Red" doesn't exist anywhere - not even in our brains. "Red" is a neuronal code not for us to understand; we see gibberish and call it "red". "Brightness" and "darkness" also don't exist anywhere for the same reasons.

Phenomenal colours, sounds, etc. do exist, which is not to say that their apparent location is their real location. For example, when I look at a red wall I experience the red as being out there on the surface of the wall, but this isn't really the case. Identifying the real location of phenomenal qualities in physical space is a big problem indeed, but I think they occur in brains. This is not to say that my brain is red when I see a red wall. To be more precise, my theory is that phenomenal qualities such as colors are higher-order (gestalt) qualities of certain structural neurological properties of the CNS. This means, e.g., that a certain structural property of the CNS has the subjective quality of phenomenal redness and thus is phenomenally red. This also means that phenomenal (aka "secondary") qualities are part of certain brain processes.
Djacob7 wrote:
[Dennett] says there's no such thing as consciousness (which I fully believe), "however", we have "Access consciousness."

The latter is a misnomer insofar as mere access consciousness qua (global) internal accessibility/availability of information is not a form of consciousness at all. "Access consciousness" is a purely functionalist concept, such that even phenomenally nonconscious zombies can be said to be access-conscious.
"I don't believe there is any form of access that deserves to be called consciousness without phenomenality. After all, access is cheap. When an ordinary desktop computer calls up information from a hard drive or responds to inputs from a user, it is accessing information, but there is little temptation to say that the computer is conscious. Information access seems conscious in the human case when and only when it is accompanied by phenomenal experience."
(Prinz, Jesse J. The Conscious Brain: How Attention Engenders Experience. New York: Oxford University Press, 2012. pp. 5-6)
"Functionalists in particular try to reduce consciousness to some input-output function or causal role in the control of behaviour. Along the functionalist lines of thought, consciousness has been defined as 'access consciousness'. Access refers to the output function of conscious information: Consciousness is the type of information that accesses many other cognitive systems – motor systems – and thereby also is able to guide or control external behaviour, especially verbal reports about the contents of (reflective) consciousness. According to the functionalist definition, then, conscious information is only the information in the brain that fulfils the access function. 'Access' refers to global informational access, especially the access to output systems within the human cognitive system.
If consciousness is identified with the global access function of information, the ability to report the contents of consciousness verbally or to respond externally to stimuli is at least implied as necessary for consciousness, because 'access' generally means access to output systems. Furthermore, the access definition of consciousness reduces consciousness to a certain type of information processing (or input-output function) and hence suffers from all the same problems as functionalism does as a theory of consciousness. It leaves out qualia, and it rejects the possibility that there could be pure phenomenal consciousness that is independent of selective attention, reflective consciousness, verbal report or control of output mechanisms."
(Revonsuo, Antti. Consciousness: The Science of Subjectivity. New York: Psychology Press, 2010. p. 95)