Consciousness, what is and what it requires?

Discuss any topics related to metaphysics (the philosophical study of the principles of reality) or epistemology (the philosophical study of knowledge) in this forum.
Consul
Posts: 3328
Joined: February 21st, 2014, 6:32 am
Location: Germany

Re: Consciousness, what is and what it requires?

Post by Consul » November 25th, 2019, 8:09 pm

RJG wrote:
November 25th, 2019, 7:06 pm
Consul, you are falsely equating "experience" as "conscious experience". These are two different things.
In the first-order sense, conscious experience = (subjective) experience; and in the higher-order sense, conscious experience = (subjective) experience of which its subject is (cognitively/introspectively) conscious (which is innerly cognized or perceived by its subject). In the higher-order sense, conscious experience ≠ (subjective) experience.
"We may philosophize well or ill, but we must philosophize." – Wilfrid Sellars

BigBango
Posts: 343
Joined: March 15th, 2018, 6:15 pm

Re: Consciousness, what is and what it requires?

Post by BigBango » November 25th, 2019, 10:09 pm

Consul wrote:
November 25th, 2019, 4:54 pm
BigBango wrote:
November 25th, 2019, 1:30 am
First of all, I did not say "your" consciousness "just is". I said consciousness "just is". Certainly the sensed world of objects also "comes into being every time I awake from a dreamless sleep and ceases to be every time I fall into a dreamless sleep." However, I am sure that you, as a materialist, believe in a world of objects and that they still exist during your dream time. You need to apply your logic to your own assumptions about the nature of reality. If you did, we would have no consciousness in the world and no objects.
I am just saying that if you can dismiss consciousness as not always existing because of dreamless sleep, then why can materialists claim that objects exist between their falling asleep and waking? This is not a big point; it just results from your turning my reference to "generic" consciousness into a statement about your consciousness.
Consul wrote:
BigBango wrote:
November 25th, 2019, 1:30 am
I accept the tenets of "dual aspect philosophers" like Nagel, Searle, even Leibniz. Consciousness never evolved from the material objects of science but has always existed in relation to the physical.
Searle is not a fundamentalist but an emergentist about consciousness!
I don't know what you mean by "fundamentalist". He is a reluctant dual aspect philosopher.
Consul wrote:
BigBango wrote:
November 25th, 2019, 1:30 am
Again see Tamminen. For Leibniz there are monads that are not divisible and are the source of individual conscious identities. These monads instantiate themselves in our world of physicality and that physical world is infinitely divisible. The "conscious monads" have always existed and they instantiate themselves in the physical world. They dominate that physical world and shape it into particular physical/conscious entities that are evolved versions of themselves.
When you solve the hard problem of consciousness, without having to evolve it from the inanimate objects of your particular form of materialism and also exactly explain the "beginning" of life, then you might get me to see things differently.
Under one interpretation, Leibniz's worldview is no different from Berkeley's: The world ultimately consists of nothing but immaterial minds/souls/spirits and their immaterial "ideas", i.e. their mental/experiential properties or states (what Leibniz calls "perceptions"), with apparent bodies or physical objects really being nothing but complexes of mental ideas or impressions.
I disagree with your interpretation. (See Monadology by Leibniz, p. 148.)
"1. The object of this discourse, the monad, is nothing else than a substance, which enters into the composites; simple meaning, which has no parts.
2. And there must be simple substances, since there are composites; for the composite is nothing else than an accumulation or aggregate of the simples.
3. But where there are no parts, neither extension, nor figure, nor divisibility is possible. Thus these monads are the veritable atoms of nature, and, in one word, the elements of all things.
...
90. ..."

These monads are individuals; none of them are alike. For that reason, even though they are not physically divisible, they have varying qualities, and their complexes do not just merge into a mental entity that has no extension. So from a third-person "objective" view we see a composite physical substance.


Re: Consciousness, what is and what it requires?

Post by Consul » November 26th, 2019, 4:51 pm

BigBango wrote:
November 25th, 2019, 10:09 pm
I don't know what you mean by "fundamentalist". He is a reluctant dual aspect philosopher.
According to the fundamentalistic/non-emergentistic version of naturalistic property dualism, mental properties and physical properties have always (co)existed in the universe as fundamental kinds of natural properties.
BigBango wrote:
November 25th, 2019, 10:09 pm
Consul wrote:Under one interpretation, Leibniz's worldview is no different from Berkeley's: The world ultimately consists of nothing but immaterial minds/souls/spirits and their immaterial "ideas", i.e. their mental/experiential properties or states (what Leibniz calls "perceptions"), with apparent bodies or physical objects really being nothing but complexes of mental ideas or impressions.
I disagree with your interpretation. (See Monadology by Leibniz, p. 148.)
"1. The object of this discourse, the monad, is nothing else than a substance, which enters into the composites; simple meaning, which has no parts.
2. And there must be simple substances, since there are composites; for the composite is nothing else than an accumulation or aggregate of the simples.
3. But where there are no parts, neither extension, nor figure, nor divisibility is possible. Thus these monads are the veritable atoms of nature, and, in one word, the elements of all things.
...
90. ..."

These monads are individuals; none of them are alike. For that reason, even though they are not physically divisible, they have varying qualities, and their complexes do not just merge into a mental entity that has no extension. So from a third-person "objective" view we see a composite physical substance.
I wrote "Under one interpretation…". There's another text-based interpretation, according to which all compound bodies or organisms are ultimately communities of spiritual monads (souls).

"According to Leibniz, if the only genuinely real beings are mind-like simple substances, then bodies, motion, and everything else must result from or be derivative of those simple substances and their perceptual states. In a typical statement of his idealism, Leibniz says, “I don't really eliminate body, but reduce [revoco] it to what it is. For I show that corporeal mass [massa], which is thought to have something over and above simple substances, is not a substance, but a phenomenon resulting from simple substances, which alone have unity and absolute reality.” (G II 275/AG 181) Yet, this position, denying the reality of bodies and asserting that monads are the grounds of all corporeal phenomena, as well as its metaphysical corollaries has shocked many. Bertrand Russell, for example, famously remarked in the Preface to his book on Leibniz that he felt that “the Monadology was a kind of fantastic fairy tale, coherent perhaps, but wholly arbitrary.” And, in perhaps the wittiest and most biting rhetorical question asked of Leibniz, Voltaire gibes, “Can you really believe that a drop of urine is an infinity of monads, and that each of these has ideas, however obscure, of the universe as a whole?” (Oeuvres complètes, Vol. 22, p. 434) Well, if you are Leibniz, you can. But how so?"

Gottfried Wilhelm Leibniz: https://plato.stanford.edu/entries/leibniz/
"We may philosophize well or ill, but we must philosophize." – Wilfrid Sellars

Gee
Posts: 285
Joined: December 28th, 2012, 2:41 am
Location: Michigan, US

Re: Consciousness, what is and what it requires?

Post by Gee » November 27th, 2019, 12:47 am

Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
So what you are stating is that bacteria, fungi, and plants do not have subjective experience.
Exactly! They are not subjects of experience or experiencers.
And yet you stated: "There are three basic kinds of consciousness/experience: sensation, emotion, and imagination." All life, including bacteria, fungi, and plants, senses and has sensations, so you are not being consistent in your position.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
In reality, it is difficult to prove that any species has subjective experience, even humans…
That's true. There's the famous epistemological problem of other minds/consciousnesses: https://plato.stanford.edu/entries/other-minds/

Given this problem, theoretical speculation is unavoidable; but there's still a big difference between highly plausible or probable assumptions and wildly implausible or improbable ones.


Speculation is always avoidable in your conclusions. The biggest difference between "highly plausible or probable assumptions and wildly implausible or improbable" assumptions comes down to belief and evidence. I like evidence. I do not like assumptions, as they bring out the worst in philosophy and science.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
…as we can easily state that all consciousness is physiological and not psychological.
By distinguishing between (mere) physiological sensitivity and psychological sentience, I do not mean to imply that psychological (mental/experiential) phenomena are nonphysiological or nonphysical in the sense that materialism/physicalism about them is false. For I'm merely saying that there is a relevant difference between (ontologically) objective, nonexperiential sensitivity (and corresponding faculties) and (ontologically) subjective, experiential sentience, and that the former doesn't per se include the latter.
No. What you are actually saying is that if there is no brain, then there is no mind, so there can be no psychological experience. You cannot separate the two ideas in your thinking. I suspect that this is because you do not actually study mind -- you study humans and the brain.

Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
We have decided that subjectivity requires a brain, but conversely, we will not confirm that all species with a brain also possess subjectivity. These rules on subjective experience are too inconsistent for me to see any truth in them.
The scientific evidence available strongly confirms that brains are necessary for (ontologically) subjective states of organisms (= subjective experiences). There's no deductive logical proof they are, but it's by far the most plausible and most justified assumption in the light of our scientific knowledge.
This is pure BS. It is an assumption. It is also a biased assumption made by neurology and by people who study the brain and humans -- not by all of scientific knowledge. It also assumes that the study of the mind is a study of the brain -- not of consciousness -- and it is not evidence that subjective experience is exclusive to the brain.

Consul wrote:
November 19th, 2019, 7:07 pm
As far as I know, the first animals with a central nervous system (brain) were flatworms called planarians. Whether they are subjects of experience is a matter of speculation, but it seems probable that all animals with brains are subjects of experience (of some primitive sensations at least). For once an organism is equipped with the requisite organ of consciousness, what prevents it from becoming and being conscious?
You have a point, except for the idea that we can only speculate and the "requisite organ of consciousness" part.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
Without language, subjectivity would be a difficult thing to prove, but I will try it with plants. The reason I chose plants is that they have a somewhat unique ability to grow where they want to grow and actually manipulate the growth of their bodies.

Take a seed from a tree and plant it where it can get just enough sunlight and water to grow, but it is slightly under a large piece of concrete. If it has enough of what it needs to survive, the tree will grow slightly crooked until it is above the concrete; then it will straighten itself to its correct form, pushing the concrete away or breaking it up. Decades later, there would be no indication that it started with a twisted form unless you cut it down and examined its rings at the root. Most people would agree that it achieved its correct form because of its DNA. I agree.

Then take another seed and plant it close to a river that frequently jumps its banks, causing erosion. Decades later, you may find a malformed tree that has actually corrupted its natural form in order to prevent itself from falling into the river. Trees in this situation will grow their roots into solid earth and actually grow extra limbs over the solid earth in order to preserve their balance and their lives. These extra limbs are not natural to the tree and do not comply with the balanced form of the tree, but they do conform to the balance of the situation. To me this indicates subjective experience and an effort to preserve the self.
Plants have astonishing fitness- and survival-enhancing behavioral abilities, and I can accept terms such as "plant intelligence"; but what I don't accept are terms such as "plant experience" or "plant consciousness", because a plant's attempts at self-preservation aren't evidence for the presence of subjective experience in it. On the contrary, what plants provide evidence for is how much can be done successfully (achieved or accomplished) without any consciousness. There can be high degrees of dynamic functional and informational complexity in a material system lacking any subjective, experiential dimension (sophisticated AI robots being examples).
Do you see the underlined above? Do you know what "survival" means? You just stated that plants will protect their selves. Their SELVES. They will behave in a way that protects the "self". This has nothing to do with sophisticated AI -- that is a strawman argument.

After listening to comparisons of AI to consciousness over and over in another forum, I finally became frustrated and asked: How complex does AI have to get before it can compare to a blade of grass? I only got one response to that question, and that was from a microbiologist, who stated that there is no comparison. AI cannot even reach the complexity of a cell, much less a whole blade of grass.

Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
So do you think that tree sensed the pull of the river, was afraid, and imagined growing extra branches as a solution? I, myself, have no idea of how trees do this. (chuckle)
No, I don't think trees (subjectively) sense, feel, or imagine anything. There's a book titled "Intelligent Complex Adaptive Systems", and plants are examples of such systems. But there's a relevant difference between nonconscious, nonexperiencing ICASs and conscious, experiencing ones, because an ICAS needn't be an EICAS, an Experiencing Intelligent Complex Adaptive System.

Are you telling me that plants sense things objectively? Please explain how that works. One can observe objectively, but sensing is subjective.

Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
My thoughts on this are similar. I tend to think of awareness as causing a kind of bond. When you see a horse, an image of that horse is in your mind, but the horse is not. You have a mental bond with that horse (image); then you bond other thoughts, sensations, memories, etc., to the image of the horse, building imagination.
My point is that visually imagining a horse is similar to visually perceiving (seeing) a (physical) horse. So-called mental images of something are simulated sense-impressions of it.
Yes it is similar, but how does that work with consciousness? How does the image go from the horse, to the eye, to the brain, and then to the mind? I think awareness actually causes a bond.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
I can agree that the first physical sense was probably touch, but I have a problem with this. Tell me, what is the difference between a chemical reaction and a sensation? To me, a sensation differs because it sends a communication somewhere, whereas a chemical reaction does not necessarily communicate. So if first life did not have a brain, then where was the sense information sent? What received the information? I keep coming back to the idea that it had to be sent to the life form itself -- which would make some form of consciousness come before sensation could even exist.
First of all, I use "sensation" solely to refer to a sort of subjective experience; so all sensations are (ontologically) subjective by definition. This is not to say that sensations aren't physicochemical processes, but only that no such process is a sensation (in the psychological/phenomenological sense of the term) unless it involves subjective sensory qualia (sensa) which are innerly felt by organisms.
So do you think sensations are outerly felt by the organisms?
Consul wrote:
November 19th, 2019, 7:07 pm
"The irreducible minimum involved in mentality would seem to be the fact which we express by the phrase 'feeling somehow', e.g., feeling cross or tired or hungry. It seems to be logically possible that this characteristic, which we might call 'sentience', could belong to a thing or event which had no other mental characteristic."

(Broad, C. D. The Mind and its Place in Nature. 1925. Reprint, Abingdon: Routledge, 2000. p. 634)
You keep doing this. I have noted it throughout this thread that you will take a statement or paragraph from a book and use it as an argument. This is an argument from authority, but there is no authority on consciousness, only theories, positions, and opinions. You are also taking it out of context, so unless I have read that book, I have no idea of the relevance. It looks a lot like "cherry picking" to me.
Consul wrote:
November 19th, 2019, 7:07 pm
The physicochemical mechanisms of action, reaction, interaction, and communication that we find in brainless animals, plants, protozoa, and bacteria work well without any subjective qualia/sensa. These organisms are all nonconscious "zombies".


That is a crock! You do realize that the word "zombie" actually means that it is dead -- don't you? Dead animals, plants, protozoa, and bacteria don't do anything, because they are dead. Zombies are not actually real.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
Our thoughts differ here. I see simple awareness as something that is always on, but I see focus as directing it, turning it on and off, and guiding the strength of it. Focus would come from matter, so if the focus is too strong or dense, there is no awareness (like a rock). Intermediate focus could produce life, and different chemistry or make up would dictate how much, and of what, life can be aware. I have been playing with the idea that focus and awareness in balance is what causes intelligence. If the focus is too weak, we end up with awareness without focus -- which we tend to call Nirvana or "God".
There's a difference between objective awareness defined in purely functional-informational terms and subjective awareness defined in experiential terms (= phenomenal consciousness). Objective awareness (of information) can be ascribed to nonconscious "zombie agents", and even a device such as a motion detector can be said to be objectively aware of its environment.
You know that "zombies" are not actually alive, don't you? Zombies are dead, as dead as motion detectors.
Consul wrote:
November 19th, 2019, 7:07 pm
Subjective awareness/consciousness/experience is not "always on". For example, it's off during a dreamless sleep and during general anesthesia.
Awareness is always on -- until you are dead -- then it is off for you. Even if the rational aspect of mind takes a nap, the body is still aware of the need to breathe, digest, and continue its work and maintenance to keep us alive.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
Our thoughts also differ here. Prior to consciousness, there were no species. All life forms of ALL species have survival instincts; this means that they have some feeling/emotion, awareness, and knowledge. This is not something that I made up; it is taught in biology that all species have survival instincts. For some reason, we have decided that consciousness can only come from a brain, so these survival instincts in other species have to be different. They are not. The survival instincts in all life line up very comfortably with Freud's drives in the Id, so I am talking about mind. If you think instincts and drives are different, please explain to me how they are different. While you are at it, explain why we can manipulate the hell out of a cell, but we cannot synthesize one -- we cannot start life.
An instinct is…

"1. an innate propensity to emit a relatively fixed response to a stimulus. 2. Any natural and apparently innate drive or motivation, such as those associated with sex, hunger, and self-preservation." (Oxford Dictionary of Psychology)

"an innate tendency to behave in a particular way, which does not depend critically on particular learning experiences for its development and therefore is seen in a similar form in all normally reared individuals of the same sex and species. Much instinctive behaviour takes the form of fixed action patterns. These are movements that once started are performed in a stereotyped way unaffected by external stimuli." (Oxford Dictionary of Biology)

"behaviour that occurs as an inevitable stereotyped response to an appropriate stimulus, sometimes equivalent to species-specific behaviour." (Henderson's Dictionary of Biology)

The presence of certain instincts in all biological organisms doesn't entail and isn't even evidence for the presence of phenomenal consciousness/subjective experience in them.
I don't know you well enough to try to discern if you are making a strawman argument or if you are actually unaware of the difference between instincts and survival instincts. The only quote above that is relevant is the one from the Oxford Dictionary of Psychology.

I did 20 pages on the subject of instincts in a science forum with the help of a neurologist, who also worked on an AI project, an animal behaviorist, and a few other knowledgeable members. There is not much that I didn't learn about instincts, but the paramount thing that I learned is that no one actually understands them.

Survival instincts are different. We know a great deal about them, what triggers them, and how they work through hormones and pheromones. We know which parts of the body produce these chemicals; we know that the brain is sitting in a chemical bath of them; we know how individual hormones relate to individual instincts; we know that hormones can affect emotions and moods and turn off and on different aspects of mind; and we use these chemicals to treat schizophrenia and other mental disorders. We know that monks who are trying to reach Nirvana often use fasting and sleep deprivation in order to succeed, and we know that fasting and sleep deprivation affect hormone levels. We also know that hormones cause homeostasis and preserve life in a life form and within a species, but they can also turn off and on the switches in DNA, which means that they could possibly influence evolution. To say that hormones and survival instincts have nothing to do with phenomenal consciousness is ludicrous.

Look up survival instincts (self preservation) and look up hormones/pheromones -- Wiki will do as these are very basic ideas. Then we can talk.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
My thought is that people associate mind with brain, so they limit their own knowledge of what is and must be true. Subjectivity and minimal consciousness started with life, then as physical life evolved, so did consciousness until it reached the human consciousness that we experience.
If brains are unnecessary for consciousness, what alternative consciousness-realizing organs or organismal mechanisms are there in organisms which lack a central nervous system or even a nervous system?
I would say hormones and pheromones. Nervous systems and even the CNS are necessary to communicate within the body; they have no direct access to mind (except on the cellular level) as far as I can see -- much like computers, or "zombies". The brain itself does have access to mind, but then it is swimming in a bath of hormones. Most of what hormones do is access mind internally and externally. All species have some type/s of hormones.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
If there was an "ignition", it happened a long time ago, and I doubt that it happened in a singular event.
Do you have a coherently intelligible concept of a borderline state of (phenomenal) consciousness which is neither definitely a nonexperience nor definitely an experience? – I haven't. (See my argument in a previous post of mine!)
You mistook my meaning. What I doubt is that life could start with a singular event. First consciousness (the unconscious aspect) would have to evolve enough to support life, then life would have to have evolved in a mass, rather than in a single life form.

If you want an example of something that is and is not a life form, you could consider viruses. Although viruses have DNA, they do not have survival instincts. When in a body, they will mimic survival instincts and maintain themselves, reproduce, and even evolve, just like bacteria, but outside of a body, they will just sit there like a rock. There are some theories about viruses being responsible for life starting or for the evolution of life, but I have not accepted any of them.
Consul wrote:
November 19th, 2019, 7:07 pm
Gee wrote:
November 19th, 2019, 12:45 am
I agree that a cognitive mind is not necessarily a conscious mind, as in AI. Can you see that a conscious mind is not necessarily a cognitive mind?
I don't think a cognitively totally dysfunctional mind/brain can be a conscious mind/brain.

You come perilously close to stating that the mentally handicapped cannot be conscious here.
Consul wrote:
November 19th, 2019, 7:07 pm
The Oxford Dictionary of Psychology defines "cognition" broadly as "the mental activities involved in acquiring and processing information", so a cognitive mind/brain is an information processor (a neurobiological CPU); and there can be no experiencing mind/brain without an information-processing mind/brain. Consciousness is integrated into the cognitive architecture of the mind/brain (which is not to say that consciousness is reducible to cognition or intelligence).
Do you have any idea of what the unconscious aspect of mind is? Or how it works? It is unfortunate that we use the word "unconscious", as that makes people assume that it is NOT consciousness. Nothing could be further from the truth. Not all life possesses a rational, conscious aspect of mind, but all life does possess consciousness in the unconscious aspect of mind, and there is a lot of evidence to support this. To understand it, you really need to understand the unconscious.

Gee


Re: Consciousness, what is and what it requires?

Post by Consul » November 27th, 2019, 9:19 pm

Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
Exactly! They are not subjects of experience or experiencers.
And yet you stated: "There are three basic kinds of consciousness/experience: sensation, emotion, and imagination." All life, bacteria, fungi, and plants, sense and have sensation, so you are not being consistent in your position.
Yes, I am, because I've already stressed several times that there is a relevant difference between (mere) physiological sensitivity (reactivity/responsivity to physical/chemical stimuli or signals) and psychological sentience (experience of sensations). When nonsentient organisms are said to "sense" things, this is "objective sensing" (or "objective perceiving") in the context of merely physiological sensitivity. For example, bacteria are capable of "quorum sensing", which has nothing to do with subjective sensing:

"Quorum sensing is the regulation of gene expression in response to fluctuations in cell-population density. Quorum sensing bacteria produce and release chemical signal molecules called autoinducers that increase in concentration as a function of cell density. The detection of a minimal threshold stimulatory concentration of an autoinducer leads to an alteration in gene expression. Gram-positive and Gram-negative bacteria use quorum sensing communication circuits to regulate a diverse array of physiological activities. These processes include symbiosis, virulence, competence, conjugation, antibiotic production, motility, sporulation, and biofilm formation. In general, Gram-negative bacteria use acylated homoserine lactones as autoinducers, and Gram-positive bacteria use processed oligo-peptides to communicate. Recent advances in the field indicate that cell-cell communication via autoinducers occurs both within and between bacterial species. Furthermore, there is mounting data suggesting that bacterial autoinducers elicit specific responses from host organisms. Although the nature of the chemical signals, the signal relay mechanisms, and the target genes controlled by bacterial quorum sensing systems differ, in every case the ability to communicate with one another allows bacteria to coordinate the gene expression, and therefore the behavior, of the entire community. Presumably, this process bestows upon bacteria some of the qualities of higher organisms. The evolution of quorum sensing systems in bacteria could, therefore, have been one of the early steps in the development of multicellularity."

Source: https://www.ncbi.nlm.nih.gov/pubmed/11544353
Gee wrote:
November 27th, 2019, 12:47 am
Speculation is always avoidable in your conclusions. The biggest differences between "highly plausible or probable assumptions and wildly implausible or improbable" assumptions is belief and evidence. I like evidence. I do not like assumptions as they bring out the worst in philosophy and science.
No, they don't. Assumptions (as I use this term) are weak beliefs or weaker than beliefs, and they are much weaker than convictions, being an expression of doxastic modesty in case of epistemic uncertainty. Assumptions are judgements or opinions without conviction. I equate them neither with (a priori) presuppositions nor with non-evidence-based or non-justified assertions.

You enter the realm of speculative assumptions or speculations as soon as your conclusions go beyond what is deductively or inductively inferable from your (evidence-describing) premises.
Gee wrote:
November 27th, 2019, 12:47 am
No. What you are actually saying is that if there is no brain, then there is no mind, so there can be no psychological experience. You can not separate the two ideas in your thinking. I suspect that this is because you do not actually study mind -- you study humans and the brain.
It depends on what we mean by "mind". If minds are defined not as mental substances but as complexes of mental attributes, then what attributes are necessary (and sufficient) for having a mind? Some say that having consciousness is necessary, such that nonconscious beings are mindless beings. If that's true, brainless beings are mindless beings, because consciousness is brain-dependent. For example, Galen Strawson thinks that…

"[O]nly experiencing beings can have mental properties (be in mental states, etc.). …The basic idea here is very simple: experience is crucial. (I am expounding an intuition, not offering an argument.) A being is a mental being just in case it is an experiencing being; only a mental being can have mental properties. And when we ask which, if any, of the properties of a mental being, other than its experiential properties, are mental properties, the answer may be no more than a matter of convenient theoretical or terminological decision."

(Strawson, Galen. Mental Reality. 2nd ed. Cambridge, MA: MIT Press, 2009. p. 154)

Good point! Experiential properties are paradigmatic mental properties, but what kinds of nonexperiential properties are (distinctively and genuinely) mental ones (as opposed to nonmental ones)?

Consider the so-called propositional attitudes such as beliefs and desires. Can a thing inherently lacking consciousness still have nonconscious propositional attitudes? It can only if they are defined behavioristically in terms of behavioral dispositions—but what's genuinely mental about such nonconscious dispositional states?

If having a mind requires nothing more than dispositions to behavior, behavior-causing and -controlling internal processes, and forms of behavior, then having a mind is independent of having consciousness; and then it's also independent of having a brain—unless the forms of behavior are so complex and variable they need to be governed by a CPU (central processing unit), which needn't be an organic brain. (Computers and AI robots have CPUs.)

For example, plants are behavioral systems without a CPU or brain. But can they properly be said to have minds or mental properties/states? Does it make sense to ascribe (nonconscious) propositional attitudes to them (in addition to patterns of behavior)? Is "plant intelligence" a mental characteristic? What's mental about the "intelligence" of plants?
Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
The scientific evidence available strongly confirms that brains are necessary for (ontologically) subjective states of organisms (= subjective experiences). There's no deductive logical proof they are, but it's by far the most plausible and most justified assumption in the light of our scientific knowledge.
This is pure BS. It is an assumption. It is also a biased assumption made by neurology and people who study the brain and humans -- not all scientific knowledge. It also assumes the study of the mind is a study of the brain -- not consciousness -- and is not evidence of exclusivity of subjective experience in the brain.
No, that assumption is not at all an expression of bias! That the brain is the natural organ (substrate/seat/source) of consciousness not only in humans but in all conscious bodies or organisms is the only reasonably plausible conclusion made on the basis of the scientific evidence coming from biology, neurology, and psychopathology (psychiatry). The scientific evidence doesn't eliminate the logical possibility of brain-independent forms of consciousness/experience, but it makes it highly implausible and improbable that there actually are such forms.
Gee wrote:
November 27th, 2019, 12:47 am
You have a point, except for the idea that we can only speculate and the "requisite organ of consciousness" part.
What nonbrains are there which might be alternative organs or physical substrates of consciousness?
Gee wrote:
November 27th, 2019, 12:47 am
Do you see the underlined above? Do you know what "survival" means? You just stated that plants will protect their selves. Their SELVES. They will behave in a way that protects the "self". This has nothing to do with sophisticated AI -- that is a strawman argument.
Plants protect themselves, not "their selves"; and their self-protecting behavior isn't evidence for their being "selves" in the sense of being subjects of experience.
Gee wrote:
November 27th, 2019, 12:47 am
After listening to comparisons of AI to consciousness over and over in another forum, I finally became frustrated and asked: How complex does AI have to get before it can compare to a blade of grass? I only got one response to that question, and that was from a microbiologist, who stated that there is no comparison. AI can not even reach the complexity of a cell, much less a whole blade of grass.
Many AI fans believe it's just a short step from artificial intelligence to artificial experience. I think they're wrong, and so does Michael Gazzaniga, one of the leading neuroscientists in the world:

"The most surprising discovery for me is that I now think we humans will never build a machine that mimics our personal consciousness. Inanimate silicon-based machines work one way, and living carbon-based systems work another. One works with a deterministic set of instructions, and the other through symbols that inherently carry some degree of uncertainty. This perspective leads to the view that the human attempt to mimic intelligence and consciousness in machines, a continuing goal of the field of AI, is doomed."

(Gazzaniga, Michael S. The Consciousness Instinct: Unraveling the Mystery of How the Brain Makes the Mind. New York: Farrar, Straus & Giroux, 2018. p. 236)

Nonetheless, the forms of (active and reactive) behavior AI machines are capable of are getting more and more complex and more and more sophisticated. For example, meet SPOT:

Gee wrote:
November 27th, 2019, 12:47 am
Are you telling me that plants sense things objectively? Please explain how that works. One can observe objectively, but sensing is subjective.
Yes, in the psychological or phenomenological sense (which I prefer); but in the objectivistic (neuro)physiological sense, an organism's sensing consists in the (nonconscious) receiving, processing, and reacting to physical or chemical stimuli or signals.
Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
My point is that visually imagining a horse is similar to visually perceiving (seeing) a (physical) horse. So-called mental images of something are simulated sense-impressions of it.
Yes it is similar, but how does that work with consciousness? How does the image go from the horse, to the eye, to the brain, and then to the mind? I think awareness actually causes a bond.
:?:
When you imagine or perceive a horse, no horse-images/-pictures are moving from the imagined/perceived horse to your brain/mind.
Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
First of all, I use "sensation" solely to refer to a sort of subjective experience; so all sensations are (ontologically) subjective by definition. This is not to say that sensations aren't physicochemical processes, but only that no such process is a sensation (in the psychological/phenomenological sense of the term) unless it involves subjective sensory qualia (sensa) which are innerly felt by organisms.
So do you think sensations are outerly felt by the organisms?
I've used the adverb "innerly" to stress the special character of feelings. They are not only inner in the spatial sense of occurring inside or within organisms, but also in the nonspatial sense of what Colin McGinn calls "ontological innerness", which concerns the special external imperceptibility, privacy, and subjectivity of experiences.
Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
The physicochemical mechanisms of action, reaction, interaction, and communication that we find in brainless animals, plants, protozoa, and bacteria work well without any subjective qualia/sensa. These organisms are all nonconscious "zombies".

That is a crock! You do realize that the word "zombie" actually means that it is dead -- don't you? Dead animals, plants, protozoa, and bacteria don't do anything, because they are dead. Zombies are not actually real.
By "zombie" I don't mean zombies as depicted in George Romero's movies and The Walking Dead but simply nonconscious living or nonliving agents or automata.

In the philosophy of mind, the word "zombie" is used in a special sense: https://plato.stanford.edu/entries/zombies/
Gee wrote:
November 27th, 2019, 12:47 am
Awareness is always on -- until you are dead -- then it is off for you. Even if the rational aspect of mind takes a nap, the body is still aware of the need to breathe, digest, and continue in its work and maintenance to keep us alive.
You're talking about objective awareness that is different from subjective awareness/consciousness, because the former is nonconscious perception and processing of physical or chemical signals. I didn't say objective awareness/perception is off during a dreamless sleep.
Gee wrote:
November 27th, 2019, 12:47 am
I don't know you well enough to try to discern if you are making a strawman argument or if you are actually unaware of the difference between instincts and survival instincts. The only quote above that is relevant is the one from the Oxford Dictionary of Psychology.

I did 20 pages on the subject of instincts in a science forum with the help of a neurologist, who also worked on an AI project, an animal behaviorist, and a few other knowledgeable members. There is not much that I didn't learn about instincts, but the paramount thing that I learned is that no one actually understands them.

Survival instincts are different. We know a great deal about them, what triggers them, and how they work through hormones and pheromones. We know which parts of the body produce these chemicals, we know that the brain is sitting in a chemical bath of them, we know how individual hormones relate to individual instincts, we know that hormones can affect emotion, moods, and turn off and on different aspects of mind, and we use these chemicals to treat schizophrenia and other mental disorders. We know that monks, who are trying to reach Nirvana, often use fasting and sleep deprivation in order to succeed, and we know that fasting and sleep deprivation affect hormone levels. We also know that hormones cause homeostasis and preserve life in a life form and within a species, but can also turn off and on the switches in DNA, which means that they could possibly influence evolution. To say that hormones and survival instincts have nothing to do with phenomenal consciousness is ludicrous.
I'm not saying that (the experiential content of) phenomenal consciousness isn't influenced by hormonal factors. My point is that the presence of instincts in general and survival instincts in particular doesn't entail the presence of phenomenal consciousness. Instinctive behavior is independent of subjective experience.

By the way, the eminent neuroscientist Michael Gazzaniga has written a book titled "The Consciousness Instinct":

"Plainly stated, I believe consciousness is an instinct. Many organisms, not just humans, come with it, ready-made. That is what instincts are, something organisms come with. Living things have an organization that allows life and ultimately consciousness to exist, even though they are made from the same materials as the non-living natural world that surrounds them. And instincts envelop organisms from bacteria to humans. Survival, sex, resilience, and walking are commonly thought to be instincts, but so, too, are more complex capacities such as language and sociality—all are instincts. The list is long, and we humans seem to have more instincts than other creatures. Yet there is something special about the consciousness instinct. It is no ordinary instinct. In fact, it seems so extraordinary that many think only we humans can lay claim to it. Even if that’s not the case, we want to know more about it. And because we all have it, we all think we have insight into it. As we will see, it is a slippery, complex instinct situated in the universe’s most impenetrable organ, the brain."

(Gazzaniga, Michael S. The Consciousness Instinct: Unraveling the Mystery of How the Brain Makes the Mind. New York: Farrar, Straus & Giroux, 2018. pp. 5-6)
Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
If brains are unnecessary for consciousness, what alternative consciousness-realizing organs or organismal mechanisms are there in organisms which lack a central nervous system or even a nervous system?
I would say hormones and pheromones. Nervous systems and even the CNS are necessary to communicate within the body; they have no direct access to mind (except on the cellular level) as far as I can see -- much like computers, or "zombies". The brain itself does have access to mind, but then it is swimming in a bath of hormones. Most of what hormones do is access mind internally and externally. All species have some type/s of hormones.
So you think endocrinological processes in organisms are sufficient for (phenomenal) consciousness? Do you think glands are organs of consciousness? (Only animals have glands, plants don't. I've read that every plant cell can produce hormones. Do you think each plant cell is therefore an organ of consciousness?)

Is there any scientist who takes your hypothesis seriously—that endocrinological processes constitute or produce subjective experiences (independently of neurological processes)?
Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
Do you have a coherently intelligible concept of a borderline state of (phenomenal) consciousness which is neither definitely a nonexperience nor definitely an experience? – I haven't. (See my argument in a previous post of mine!)
You mistook my meaning. What I doubt is that life could start with a singular event. First consciousness (the unconscious aspect) would have to evolve enough to support life, then life would have to have evolved in a mass, rather than in a single life form.

If you want an example of something that is and is not a life form, you could consider viruses. Although viruses have DNA, they do not have survival instincts. When in a body, they will mimic survival instincts and maintain themselves, reproduce, and even evolve, just like bacteria, but outside of a body, they will just sit there like a rock. There are some theories about viruses being responsible for life starting or for the evolution of life, but I have not accepted any of them.
There are borderline cases of life such as viruses, which I don't deny. What I deny is that there are borderline cases of (phenomenal) consciousness.

Living organisms don't need consciousness to support or protect their lives. The evolution of life is prior to the evolution of consciousness!
Gee wrote:
November 27th, 2019, 12:47 am
Consul wrote:
November 19th, 2019, 7:07 pm
I don't think a cognitively totally dysfunctional mind/brain can be a conscious mind/brain.
You come perilously close to stating that the mentally handicapped can not be conscious here.
I wrote "totally dysfunctional", not "partially dysfunctional". It's certainly not the case that all mental or intellectual disabilities or impairments destroy consciousness—in fact most don't—but some neurocognitive functions are essential to consciousness. If certain information-processing neural circuits in certain parts of the brain are disrupted (ask the experts for details!), consciousness is thereby destroyed.
Gee wrote:
November 27th, 2019, 12:47 am
Do you have any idea of what the unconscious aspect of mind is? Or how it works? It is unfortunate that we use the word, unconscious, as that makes people assume that it is NOT consciousness. Nothing could be farther from the truth. All life does not possess a rational conscious aspect of mind, but it does possess consciousness in the unconscious aspect of mind, and there is a lot of evidence to support this. To understand it, you really need to understand the unconscious.
The concept of a non-/unconscious mind or mental event/state is hard to understand.

"[W]hat makes something mental when it is not conscious?"

(Searle, John R. The Rediscovery of the Mind. Cambridge, MA: MIT Press, 1992. p. 154)

I agree with Searle that…

"the only occurrent reality of the mental as mental is consciousness."

(Searle, John R. The Rediscovery of the Mind. Cambridge, MA: MIT Press, 1992. p. 187)

That is to say, there is nothing distinctively and genuinely mental about nonconscious occurrences (states/events/processes). So, strictly speaking, the mind = the conscious mind (consciousness). (This is one point where I agree with Descartes.)

Of course, we do call nonconscious propositional attitudes (e.g. beliefs and desires) and corresponding behavioral dispositions or habits "mental"; but it's unclear what their alleged mentality consists in when they don't have any experiential content. Many will say that nonconscious propositional attitudes have representational contents, but what's the difference between a nonconscious mental representation and an equally nonconscious nonmental or purely neural representation in the brain? It seems that calling nonconscious occurrences mental is just a matter of arbitrary convention, because the only mental phenomena which are clearly and distinctively mental are experiential, (phenomenally) conscious ones.

Searle thinks what makes the difference between unconscious mental states (propositional attitudes) and unconscious nonmental states is "accessibility to consciousness". However, when he writes that "[w]e have no notion of the unconscious except as that which is potentially conscious," this is misleading, because an unconscious propositional attitude cannot become a conscious one. No propositional attitude is ever conscious, in the sense that there are propositional-attitude experiences such as occurrent believings (belief-experiences) or desirings (desire-experiences). There are certainly conscious thinkings or inner sayings/speakings (e.g. "I believe/desire that…"), but these are not the beliefs or desires themselves but only conscious expressions or indications of them.

———

"Let us begin by asking, naively, Do unconscious mental states really exist? How can there be a state that is literally mental and at the same time totally unconscious? Such states would lack qualitativeness and subjectivity and would not be part of the unified field of consciousness. So, in what sense, if any, would they be mental states?

I think many people, including some extremely sophisticated authors such as Freud, have the following rather simplistic picture. An unconscious mental state is exactly like a conscious mental state only minus the consciousness. The problem with this picture is that it is very hard to make any sense of it."


(Searle, John R. Mind: A Brief Introduction. New York: Oxford University Press, 2004. pp. 237-8)

"I believe that in spite of our complacency in using the concept of the unconscious, we do not have a clear notion of unconscious mental states, and my first task in clarification is to explain the relations between the unconscious and consciousness. The claim I will make can be stated in one sentence: The notion of an unconscious mental state implies accessibility to consciousness. We have no notion of the unconscious except as that which is potentially conscious.
Our naive, pretheoretical notion of an unconscious mental state is the idea of a conscious mental state minus the consciousness. But what exactly does that mean? How could we subtract the consciousness from a mental state and still have a mental state left over?"


(Searle, John R. The Rediscovery of the Mind. Cambridge, MA: MIT Press, 1992. p. 152)

"At its most naive our picture is something like this: Unconscious mental states in the mind are like fish deep in the sea. The fish that we can't see underneath the surface have exactly the same shape they have when they surface. The fish don't lose their shapes by going under water. Another simile: Unconscious mental states are like objects stored in the dark attic of the mind. These objects have their shapes all along, even when you can't see them. We are tempted to smile at these simple models, but I think something like these pictures underlies our conception of unconscious mental states, and it is important to see what is right and what wrong about that conception."

(Searle, John R. The Rediscovery of the Mind. Cambridge, MA: MIT Press, 1992. pp. 152-3)

"[T]he ontology of unconscious mental states, at the time they are unconscious, consists entirely in the existence of purely neurophysiological phenomena. Imagine that a man is in a sound dreamless sleep. Now, while he is in such a state it is true to say of him that he has a number of unconscious mental states. For example, he believes that Denver is the capital of Colorado, Washington is the capital of the United States, etc. But what fact about him makes it the case that he has these unconscious beliefs? Well, the only facts that could exist while he is completely unconscious are neurophysiological facts. The only things going on in his unconscious brain are sequences of neurophysiological events occurring in neuronal architectures. At the time when the states are totally unconscious, there is simply nothing there except neurophysiological states and processes."

(Searle, John R. The Rediscovery of the Mind. Cambridge, MA: MIT Press, 1992. p. 159)

"The overall picture that emerges is this. There is nothing going on in my brain but neurophysiological processes, some conscious, some unconscious. Of the unconscious neurophysiological processes, some are mental and some are not. The difference between them is not in consciousness, because, by hypothesis, neither is conscious; the difference is that the mental processes are candidates for consciousness, because they are capable of causing conscious states. But that's all. All my mental life is lodged in the brain. But what in my brain is my 'mental life'? Just two things: conscious states and those neurophysiological states and processes that—given the right circumstances—are capable of generating conscious states. Let's call those states that are in principle accessible to consciousness 'shallow unconscious', and those inaccessible even in principle 'deep unconscious'. The main conclusion of this chapter so far is that there are no deep unconscious intentional states."

(Searle, John R. The Rediscovery of the Mind. Cambridge, MA: MIT Press, 1992. p. 162)

User avatar
Consul
Posts: 3328
Joined: February 21st, 2014, 6:32 am
Location: Germany

Re: Consciousness, what is and what it requires?

Post by Consul » November 28th, 2019, 1:17 am

Consul wrote:
November 27th, 2019, 9:19 pm
So you think endocrinological processes in organisms are sufficient for (phenomenal) consciousness? Do you think glands are organs of consciousness? (Only animals have glands, plants don't. I've read every plant cell can produce hormones. Do you think each plant cell is therefore an organ of consciousness?)
Descartes "regarded [the pineal gland] as the principal seat of the soul and the place in which all our thoughts are formed." He was wrong. Anyway, brainless organisms don't have a pineal gland.

Descartes and the Pineal Gland: https://plato.stanford.edu/entries/pineal-gland/

Gee
Posts: 285
Joined: December 28th, 2012, 2:41 am
Location: Michigan, US

Re: Consciousness, what is and what it requires?

Post by Gee » November 29th, 2019, 9:14 pm

Atla wrote:
November 21st, 2019, 11:52 am
Why is Western philosophy still clinging to the idea that phenomenal consciousness / qualia has something to do with life anyway? Science showed us that organisms and rocks are just arranged differently. So why so scared to question the above assumption?
I think it is because we really like to have evidence before we draw our conclusions, and, quite frankly, life is our only actual evidence of consciousness.

Gee

Gee
Posts: 285
Joined: December 28th, 2012, 2:41 am
Location: Michigan, US

Re: Consciousness, what is and what it requires?

Post by Gee » November 30th, 2019, 12:22 am

Consul wrote:
November 22nd, 2019, 11:09 am
Gee wrote:
November 21st, 2019, 6:24 pm
I don't see how you got that out of what I stated. Wait; are you still assuming a brain is required for subjective experience? No. A brain is only required in order to know you have subjective experience.
No, it's required for subjective experience.
No. Experience is analogue, just like emotion, feeling, and awareness -- which means that we do not actually know about them when they happen -- like under anesthesia. But this does not mean that the experience did not happen or that our bodies did not respond to the experience. When a surgeon cuts you open, your body will react the same if you are conscious, or if you are knocked out, because the experience is real; you, your body, will still have subjective experience.

Experience works through the unconscious aspect of mind, just like emotion, feeling, and awareness. Because it is unconscious, we do not know about it. This is what RJG has been trying to explain, that experience is necessary before we can have recognition of that experience or thoughts about that experience in the conscious aspect of mind. Information flows from the unconscious to the conscious, which the brain actually does cause. The brain is essentially analogue, but it has the ability to turn our analogue experiences into digital thought -- at that point, we call it conscious thought. This is why most people associate thought with consciousness, because that is when we actually know about it, but it is only one step in the process. Or you could call it a level of awareness.
Consul wrote:
November 22nd, 2019, 11:09 am
Gee wrote:
November 21st, 2019, 6:24 pm
I see awareness, consciousness, sentience, perceiving, sensing, and any other description of consciousness in a life form as all being different levels of the same thing -- consciousness. I also see all of these categories in life forms as all having a subjective "self". A subjective self is required in order for it to be a life form.
The noun "self" is a terrible word. I don't like it. If selves are subjects, then let's use "subject" instead!
I don't think that we can reduce the word "self" to subject. Self is a perspective, whereas subject, as you are using it, is an object. Not the same thing at all.
Consul wrote:
November 22nd, 2019, 11:09 am
There's no justification for equating life (Leben in German) with conscious, subjective life (Erleben in German). Living objects (organisms) aren't per se experiencing objects = subjects.

I disagree. On the other hand, when the "living objects" die, then I can agree that they are no longer "experiencing objects".
Consul wrote:
November 22nd, 2019, 11:09 am
Gee wrote:
November 21st, 2019, 6:24 pm
This was very interesting right up to the point where I started underlining. At that point the researchers made the assumption that a brain was necessary for mind and for "self", which causes subjective experience.
That conscious life requires a special organ—viz. the brain—is a scientifically well-justified assumption.

The brain causes consciousness in the sense that I stated above. The brain does not cause mind -- at best it causes the rational aspect of mind, which I explained earlier is more a reflection of consciousness. The brain can cause a sense of "self" when it reflects consciousness and shows us the rational aspect of mind, but this is an illusion -- a reflection. The brain does not cause self; it can't, because self sources from the unconscious.
Consul wrote:
November 22nd, 2019, 11:09 am
Whether it's necessary for having a "mind" or a "self" depends on the definition of these terms.
Which means that we need to have a good understanding of "mind" and "self" in order to discuss this.
Consul wrote:
November 22nd, 2019, 11:09 am
For example, immunologists speak of "immune selfhood", which has nothing to do with psychological subjecthood. (See Philosophy of Immunology!) And if minds are defined in purely functional terms as physiological/physical input-output/stimulus-response mechanisms, then they are independent of brains.
Consider this: When studying law, I learned about self-defense. You can claim self-defense when protecting your actual self, your spouse, your children, and/or your home, and it will be accepted as a defense in most US Courts even when you kill. But you can not routinely claim self-defense when protecting your parents, your siblings or other relatives, friends, or your business. Why is that? Do your spouse, children, and home constitute a larger psychological self? They certainly have nothing to do with "subjecthood".

It is interesting to note that preservation of your spouse, children, and home are all covered under survival instincts and regulated by hormones. All are necessary for the continuation of the "self".

Gee

Gee
Posts: 285
Joined: December 28th, 2012, 2:41 am
Location: Michigan, US

Re: Consciousness, what is and what it requires?

Post by Gee » November 30th, 2019, 12:25 am

Consul wrote:
November 22nd, 2019, 11:17 am
Gee wrote:
November 21st, 2019, 7:02 pm
Well, I usually like Chalmers and respect his opinions, but I have to ask a question first. "The Conscious Mind: In Search of a Fundamental Theory" is about what? Is it about human consciousness/animal consciousness, or is it about conscious life?
Chalmers' book is about consciousness and "the hard problem" (p. xii) of how "consciousness arises from physical systems such as brains." (p. xi)
Then it is about brains and the rational aspect of mind as I explained earlier. I am not sure it is relevant to this thread.

Gee

Gee
Posts: 285
Joined: December 28th, 2012, 2:41 am
Location: Michigan, US

Re: Consciousness, what is and what it requires?

Post by Gee » November 30th, 2019, 12:28 am

Consul wrote:
November 22nd, 2019, 11:22 am
Gee wrote:
November 21st, 2019, 6:42 pm
But the topic of consciousness is vast and much more complex than most people ever consider. We constantly try to reduce the idea to something that we can understand; hence, all of the "ists" and "isms" that have evolved to try to explain consciousness. There is room for most of these "isms" in a comprehensive theory of consciousness, even dualism and spiritualism, but you have to delve into the unconscious aspect of mind, "self", and bonding, in order to try to make sense of it.
Strictly speaking, I think the only distinctively and genuinely mental phenomena are experiential phenomena, such that mind = conscious mind/consciousness.
Are you including emotion and awareness in this? Because emotion and awareness work through the unconscious mind.

Gee

Atla
Posts: 986
Joined: January 30th, 2018, 1:18 pm

Re: Consciousness, what is and what it requires?

Post by Atla » November 30th, 2019, 2:14 am

Gee wrote:
November 29th, 2019, 9:14 pm
Atla wrote:
November 21st, 2019, 11:52 am
Why is Western philosophy still clinging to the idea that phenomenal consciousness / qualia has something to do with life anyway? Science showed us that organisms and rocks are just arranged differently. So why so scared to question the above assumption?
I think it is because we really like to have evidence before we draw our conclusions, and, quite frankly, life is our only actual evidence of consciousness.

Gee
There is literally zero evidence that links life to phenomenal consciousness. It's just that non-living things can't talk about it. So what evidence are you talking about, quite frankly?

User avatar
Consul
Posts: 3328
Joined: February 21st, 2014, 6:32 am
Location: Germany

Re: Consciousness, what is and what it requires?

Post by Consul » November 30th, 2019, 2:49 am

Atla wrote:
November 30th, 2019, 2:14 am
There is literally zero evidence that links life to phenomenal consciousness. It's just that non-living things can't talk about it. So what evidence are you talking about, quite frankly?
What evidence for nonbiological consciousness are you talking about?!
"We may philosophize well or ill, but we must philosophize." – Wilfrid Sellars

Atla
Posts: 986
Joined: January 30th, 2018, 1:18 pm

Re: Consciousness, what is and what it requires?

Post by Atla » November 30th, 2019, 2:56 am

Consul wrote:
November 30th, 2019, 2:49 am
Atla wrote:
November 30th, 2019, 2:14 am
There is literally zero evidence that links life to phenomenal consciousness. It's just that non-living things can't talk about it. So what evidence are you talking about, quite frankly?
What evidence for nonbiological consciousness are you talking about?!
At least try to ask a relevant question.

BigBango
Posts: 343
Joined: March 15th, 2018, 6:15 pm

Re: Consciousness, what is and what it requires?

Post by BigBango » November 30th, 2019, 8:10 am

Gee wrote:
November 30th, 2019, 12:28 am
Consul wrote:
November 22nd, 2019, 11:22 am


Strictly speaking, I think the only distinctively and genuinely mental phenomena are experiential phenomena, such that mind = conscious mind/consciousness.
Are you including emotion and awareness in this? Because emotion and awareness work through the unconscious mind.

Gee
Gee, thank you for bringing up a deeper level of analysis to this discussion. Consul offers very strong arguments and is able to support his theories with a good familiarity of the literature that supports his materialism. He is a significant contributor to this forum. In my opinion you, Gee, have challenged his beliefs by bringing into the discussion the mind vs. the brain and the "unconscious" sources of emotion and awareness. We need to pay attention to Freud's theories and Jung's collective unconscious.

In addition to Freud and Jung, I would recommend some attention to Whitehead. Whitehead is a panpsychist who would support Atla's theories about rocks. Personally, I think Whitehead was wrong in his assertion that the nature of all objects is the result of the habits or preferences of underlying conscious entities. In my world view I see galaxies as the primal existent. It is simply a feature of galaxies, as they age, to separate the objectively physical from the valuing planetary ecosystems. These ecosystem evolutions are driven by their attachments to the survival of their tribes and to the development of the technology to avoid black hole death.

Of course there is a big difference between the galaxies of our familiar world and the galaxies that preceded the BC/BB. The galaxies of our world are young. In the pre-BC/BB world the galaxies were old, with technologically advanced galactic civilizations. After the collapse, the BC/BB, the galactic civilizations that had escaped annihilation found themselves in a world where their former galaxies had been crushed into plasma, which only gradually reformed molecules as it all cooled. I have calculated the size of those pre-BC/BB civilizations; they are around the size of a Planck volume.

These former, pre-BC/BB galactic civilizations, with their advanced technology, are our soul, the creators of conscious life in our world. They are conscious to begin with, having a history that stretches back to the beginning of their world, and they work to instantiate their consciousness in the new world they find themselves in after the BC/BB.

Yes, there is so much speculation in this theory that one wants to simply reject it on those grounds and never even try to verify it.

My position is that it is the best theory going for solving the "explanatory gap" and the origins of conscious life in our world. The philosophical community does not judge a theory of "mind" based on its accuracy, but judges its veracity by the simplicity of the theory.

User avatar
lucky777
New Trial Member
Posts: 2
Joined: November 29th, 2019, 5:46 am

Re: Consciousness, what is and what it requires?

Post by lucky777 » November 30th, 2019, 8:20 am
