Now I’m beginning to think you’re just being ornery. I’ll repeat myself: I use words like “ability” and “purpose” because those are the closest analogs to the concepts I have in mind. Also, “purpose” is not a property of a thing. “Purpose” is an explanation of why a thing came to exist. When we need to distinguish, we can say natural purpose versus intentional purpose. If you disagree, just try to put everything else in the context of how I am using the terms.
> I would say that rather than consciousness arising from things like 'inputs', it is the idea of 'inputs' that depends on consciousness.

And I would say that there is an idea/concept of "inputs" independent of consciousness, that the generation of a structure to represent that concept requires consciousness, and that the process of generating that structure involves something best referred to as "inputs".
> It cannot be a 'sign' unless there is something it is a sign-to, something that interprets that sign.

Okay, so let’s talk about an aversion reflex. We’ll call the neurotransmitter produced by the c-fiber an “nsign” which has the “npurpose” of being “n-interpreted” as damage, thus generating another “nsign” which is “n-interpreted” by a muscle cell which “nresponds” by contracting. Is that better?
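The nsign chain above can be sketched as a plain causal pipeline, which is the point of the coinage: every stage reacts mechanically to its input, and no stage "interprets" anything in a conscious sense. This is only a toy illustration; all names and the threshold value are my own, not anything from the discussion.

```python
# Hypothetical sketch of the aversion-reflex "nsign" chain: each stage is a
# pure function reacting to its input; there is no interpreter anywhere.

def c_fiber(tissue_damage: bool) -> float:
    """Emit an nsign (a neurotransmitter level) when damage occurs."""
    return 1.0 if tissue_damage else 0.0

def interneuron(nsign: float) -> float:
    """N-interpret the nsign as damage and emit a second nsign."""
    return nsign  # in this toy model the signal passes through unchanged

def muscle_cell(nsign: float) -> bool:
    """N-respond: contract when the incoming nsign crosses a threshold."""
    return nsign > 0.5

def aversion_reflex(tissue_damage: bool) -> bool:
    """The whole chain: damage in, contraction out, nothing 'aware' of either."""
    return muscle_cell(interneuron(c_fiber(tissue_damage)))
```

Nothing in this chain is a sign-to anyone; it is just composition of functions, which is what the n-prefixed vocabulary is meant to capture.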
> Similarly, if we are to call the physiological events a 'sign', then that presupposes a 'Cartesian theatre', where a consciousness, that is distinct from those physiological events, takes note of what has happened in those cells and interprets it as a 'sign', a 'signal'. Similarly with 'concepts'.

Close, but not quite right. The ‘Cartesian theatre’ does not contain a consciousness. The Cartesian theatre is an entity which Dan Dennett correctly describes as competent without comprehension. The theatre creates the consciousness, in that the theatre has a repertoire of input/output sets, and any individual input/output that actually occurs is a conscious event. And yes, those inputs are signs ... no, wait ... I mean ... nsigns.
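The "repertoire of input/output sets" can be pictured as nothing more than a mapping. A minimal sketch, assuming my own hypothetical stimulus and response names (none of these come from the discussion):

```python
# Toy model of a theatre that is "competent without comprehension": a bare
# mapping from inputs to outputs. On the view above, an individual
# (input, output) pair that actually occurs is a conscious event; the
# mapping itself understands nothing.

repertoire = {
    "sharp_pressure": "withdraw_limb",
    "bright_light": "close_eyelid",
    "loud_noise": "startle",
}

def theatre(stimulus: str) -> str:
    """Look up the response; pure competence, zero comprehension."""
    return repertoire.get(stimulus, "no_response")

stimulus = "sharp_pressure"
conscious_event = (stimulus, theatre(stimulus))  # one occurring input/output pair
```

The design point is that the lookup table does all the work: there is no second component standing behind it watching the table being consulted.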
> The title of the thread suggests material things like computers are currently lacking a special something, 'consciousness', that we humans have but they don't. But if consciousness/mind is simply a description of the organisation of matter, then computers already have it. They cannot avoid having it.

I did not create the title of the thread. I absolutely agree that computers, right now, have a degree of consciousness. You might have noticed I put them at level 4.
> So what project are we engaged on?

I don’t know about you, but I’m engaged on the project of explaining why computers already have consciousness at level 4 and that there is no logical impediment to creating computers at level 6 (or above?). Also, just in case someone has this idea, computers will not accidentally “become” conscious when no one is looking.