Exactly! They are not subjects of experience or experiencers.
That's true. There's the famous epistemological problem of other minds/consciousnesses: https://plato.stanford.edu/entries/other-minds/
Given this problem, theoretical speculation is unavoidable; but there's still a big difference between highly plausible or probable assumptions and wildly implausible or improbable ones.
By distinguishing between (mere) physiological sensitivity and psychological sentience, I do not mean to imply that psychological (mental/experiential) phenomena are nonphysiological or nonphysical in the sense that materialism/physicalism about them is false. For I'm merely saying that there is a relevant difference between (ontologically) objective, nonexperiential sensitivity (and corresponding faculties) and (ontologically) subjective, experiential sentience, and that the former doesn't per se include the latter.
The scientific evidence available strongly confirms that brains are necessary for (ontologically) subjective states of organisms (= subjective experiences). There's no deductive logical proof they are, but it's by far the most plausible and most justified assumption in the light of our scientific knowledge.
As far as I know, the first animals with a central nervous system (brain) were flatworms called planarians. Whether they are subjects of experience is a matter of speculation, but it seems probable that all animals with brains are subjects of experience (of some primitive sensations at least). For once an organism is equipped with the requisite organ of consciousness, what prevents it from becoming and being conscious?
Plants have astonishing fitness- and survival-enhancing behavioral abilities, and I can accept terms such as "plant intelligence"; but what I don't accept are terms such as "plant experience" or "plant consciousness", because a plant's attempts at self-preservation aren't evidence for the presence of subjective experience in it. On the contrary, what plants provide evidence for is how much can be done successfully (achieved or accomplished) without any consciousness. There can be high degrees of dynamic functional and informational complexity in a material system lacking any subjective, experiential dimension (sophisticated AI robots being examples).

Gee wrote: ↑November 19th, 2019, 12:45 am
Without language, subjectivity would be a difficult thing to prove, but I will try it with plants. The reason I chose plants is that they have a somewhat unique ability to grow where they want to grow and actually manipulate the growth of their bodies.
Take a seed from a tree and plant it where it can get just enough sunlight and water to grow, but it is slightly under a large piece of concrete. If it has enough of what it needs to survive, the tree will grow slightly crooked until it is above the concrete, then it will straighten itself to its correct form, pushing the concrete away or breaking it up. Decades later, there would be no indication that it started with a twisted form unless you cut it down and examined its rings at the root. Most people would agree that it achieved its correct form because of its DNA. I agree.
Then take another seed and plant it close to a river that frequently jumps its banks, causing erosion. Decades later, you may find a malformed tree that has actually corrupted its natural form in order to prevent itself from falling into the river. Trees in this situation will grow their roots into solid earth and actually grow extra limbs over the solid earth in order to preserve their balance and their lives. These extra limbs are not natural to the tree, and do not comply with the balanced form of the tree, but do conform to the balance of the situation. To me this indicates subjective experience and an effort to preserve the self.
No, I don't think trees (subjectively) sense, feel, or imagine anything. There's a book titled "Intelligent Complex Adaptive Systems", and plants are examples of such systems. But there's a relevant difference between nonconscious, nonexperiencing ICASs and conscious, experiencing ones, because an ICAS needn't be an EICAS, an Experiencing Intelligent Complex Adaptive System.
My point is that visually imagining a horse is similar to visually perceiving (seeing) a (physical) horse. So-called mental images of something are simulated sense-impressions of it.

Gee wrote: ↑November 19th, 2019, 12:45 am
My thoughts on this are similar. I tend to think of awareness as causing a kind of bond. When you see a horse, an image of that horse is in your mind, but the horse is not. You have a mental bond with that horse (image); then you bond other thoughts, sensations, memories, etc., to the image of the horse, building imagination.
First of all, I use "sensation" solely to refer to a sort of subjective experience; so all sensations are (ontologically) subjective by definition. This is not to say that sensations aren't physicochemical processes, but only that no such process is a sensation (in the psychological/phenomenological sense of the term) unless it involves subjective sensory qualia (sensa) which are innerly felt by organisms.

Gee wrote: ↑November 19th, 2019, 12:45 am
I can agree that the first physical sense was probably touch, but I have a problem with this. Tell me, what is the difference between a chemical reaction and a sensation? To me, a sensation differs because it sends a communication somewhere, whereas a chemical reaction does not necessarily communicate. So if first life did not have a brain, then where was the sense information sent? What received the information? I keep coming back to the idea that it had to be sent to the life form itself -- which would make some form of consciousness come before sensation could even exist.
"The irreducible minimum involved in mentality would seem to be the fact which we express by the phrase 'feeling somehow', e.g., feeling cross or tired or hungry. It seems to be logically possible that this characteristic, which we might call 'sentience', could belong to a thing or event which had no other mental characteristic."
(Broad, C. D. The Mind and its Place in Nature. 1925. Reprint, Abingdon: Routledge, 2000. p. 634)
The physicochemical mechanisms of action, reaction, interaction, and communication that we find in brainless animals, plants, protozoa, and bacteria work well without any subjective qualia/sensa. These organisms are all nonconscious "zombies".
There's a difference between objective awareness defined in purely functional-informational terms and subjective awareness defined in experiential terms (= phenomenal consciousness). Objective awareness (of information) can be ascribed to nonconscious "zombie agents", and even a device such as a motion detector can be said to be objectively aware of its environment.

Gee wrote: ↑November 19th, 2019, 12:45 am
Our thoughts differ here. I see simple awareness as something that is always on, but I see focus as directing it, turning it on and off, and guiding the strength of it. Focus would come from matter, so if the focus is too strong or dense, there is no awareness (like a rock). Intermediate focus could produce life, and different chemistry or makeup would dictate how much, and of what, life can be aware. I have been playing with the idea that focus and awareness in balance is what causes intelligence. If the focus is too weak, we end up with awareness without focus -- which we tend to call Nirvana or "God".
Subjective awareness/consciousness/experience is not "always on". For example, it's off during a dreamless sleep and during general anesthesia.
Gee wrote: ↑November 19th, 2019, 12:45 am
Our thoughts also differ here. Prior to consciousness, there were no species. All life forms of ALL species have survival instincts; this means that they have some feeling/emotion, awareness, and knowledge. This is not something that I made up; it is taught in biology that all species have survival instincts. For some reason, we have decided that consciousness can only come from a brain, so these survival instincts in other species have to be different. They are not. The survival instincts in all life line up very comfortably with Freud's drives in the Id, so I am talking about mind. If you think instincts and drives are different, please explain to me how they are different. While you are at it, explain why we can manipulate the hell out of a cell, but we can not synthesize one -- we can not start life.

Consul wrote: ↑November 12th, 2019, 6:21 pm
It follows that there must have been some moment in the course of biological evolution when the "ignition" of consciousness took place in some individual belonging to some species. There's a difference between the evolutionary prehistory of consciousness and its evolutionary history and development, which begins with its sudden original "ignition".

An instinct is…
"1. an innate propensity to emit a relatively fixed response to a stimulus. 2. Any natural and apparently innate drive or motivation, such as those associated with sex, hunger, and self-preservation." (Oxford Dictionary of Psychology)
"an innate tendency to behave in a particular way, which does not depend critically on particular learning experiences for its development and therefore is seen in a similar form in all normally reared individuals of the same sex and species. Much instinctive behaviour takes the form of fixed action patterns. These are movements that once started are performed in a stereotyped way unaffected by external stimuli." (Oxford Dictionary of Biology)
"behaviour that occurs as an inevitable stereotyped response to an appropriate stimulus, sometimes equivalent to species-specific behaviour." (Henderson's Dictionary of Biology)
The presence of certain instincts in all biological organisms doesn't entail and isn't even evidence for the presence of phenomenal consciousness/subjective experience in them.
If brains are unnecessary for consciousness, what alternative consciousness-realizing organs or organismal mechanisms are there in organisms which lack a central nervous system or even a nervous system?

Gee wrote: ↑November 19th, 2019, 12:45 am
My thought is that people associate mind with brain, so they limit their own knowledge of what is and must be true. Subjectivity and minimal consciousness started with life, then as physical life evolved, so did consciousness until it reached the human consciousness that we experience.
Do you have a coherently intelligible concept of a borderline state of (phenomenal) consciousness which is neither definitely a nonexperience nor definitely an experience? – I haven't. (See my argument in a previous post of mine!)
I don't think a cognitively totally dysfunctional mind/brain can be a conscious mind/brain. The Oxford Dictionary of Psychology defines "cognition" broadly as "the mental activities involved in acquiring and processing information", so a cognitive mind/brain is an information processor (a neurobiological CPU); and there can be no experiencing mind/brain without an information-processing mind/brain. Consciousness is integrated into the cognitive architecture of the mind/brain (which is not to say that consciousness is reducible to cognition or intelligence).

Gee wrote: ↑November 19th, 2019, 12:45 am
I agree that a cognitive mind is not necessarily a conscious mind, as in AI. Can you see that a conscious mind is not necessarily a cognitive mind?

Consul wrote: ↑November 12th, 2019, 6:21 pm
Consciousness/experience is something in addition to those functional abilities and processes! A "cognitive mind" isn't per se a conscious mind! Note that I'm not saying that having a cognitive mind is not necessary for having a conscious mind, but only that it is not sufficient unless a cognitive mind has the functional and structural complexity of a central nervous system (brain).