Could this be an incisive definition for consciousness?

Papus79
Posts: 121
Joined: February 19th, 2017, 6:59 pm

Could this be an incisive definition for consciousness?

Post by Papus79 » September 17th, 2017, 5:39 am

I'll throw this out there as it hit me - I'm not sure whether by dumb luck or something else - but I think it could be a useful way of homing in on the consciousness problem:

consciousness (here defined) - a set of probability fields, each with its own contingent requirements, whose value alignments overlap to achieve outcomes deemed optimal by the overlapping set in common.


systemic consciousness - a self-aware system's sum total collection of means necessary for its own survival

consciousness as a unit in the universe - a single probability field.


It would probably take a fair amount of further inspection to see whether that lines up with the current state of the art in neurology, but I think, at least when considering consciousness reflecting back on itself, this might be about as good a description as any, and it seems to tie together goal-orientation (suffering vs. reward) with systemic balance.

RJG
Posts: 836
Joined: March 28th, 2012, 8:52 pm

Re: Could this be an incisive definition for consciousness?

Post by RJG » September 17th, 2017, 9:37 am

Papus, I think the definition/understanding is much simpler. Here is my rendition:

  • Consciousness is the singular experience of recognition, made possible by memory. (...that's it!!)
    • Without 'memory', there can be nothing to recognize; no recognition.
      Without 'recognition', there is nothing to know.
      Without something to 'know', there is no consciousness.

The Beast
Posts: 745
Joined: July 7th, 2013, 10:32 pm

Re: Could this be an incisive definition for consciousness?

Post by The Beast » September 17th, 2017, 11:49 am

The underlying cardinality spectrum has the counterpart dimension. It must be that this dimension is memory. A DNA strand is memory. To describe cardinality from its origin is ambitious. Is there a two-way directionality (chemical) as the probability of outcome? That is: Dimension in its present form can manipulate (if little) the spectrum of cardinality and therefore is alive/has will. Consciousness in R3 as the observer.

Papus79
Posts: 121
Joined: February 19th, 2017, 6:59 pm

Re: Could this be an incisive definition for consciousness?

Post by Papus79 » September 17th, 2017, 2:38 pm

RJG wrote:Papus, I think the definition/understanding is much simpler. Here is my rendition:

  • Consciousness is the singular experience of recognition, made possible by memory. (...that's it!!)
    • Without 'memory', there can be nothing to recognize; no recognition.
      Without 'recognition', there is nothing to know.
      Without something to 'know', there is no consciousness.
All of that may very well be true, and what I'm describing could be a subset of the conditions you just described.

What got me was thinking a bit more deeply about the sensation of my own 'I' experience and the qualitative aspects of what it feels like. While I don't know that most people would necessarily use this vocabulary, they might nonetheless be sympathetic to this description on reflection: it's always occurred to me as something like fire - i.e. a byproduct activity rather than a thing in and of itself.

While I think your model says some very valid things about its constraints, perhaps it lacks an answer to 'what' it is once you have it, or why we should have it at all rather than being really high-functioning zombies with vast log reports to consult over every action. What I'm trying to say is that once it does show up, the 'I' experience is a compound probability-management vector - i.e. something almost like a 'watch and wait' service with a giant key-belt, waiting to implement the right keys at the right times. Memory is a stored log of past results, and sensory data shows what there is to 'know', which raises the possibility that what I'm describing - per your model (and it would also make sense to me in this regard) - is a process that fits in the recognition domain.

Aside from that, I know we have tendencies to throw metaphysical speculations against this - i.e. some think in dualism-like terms and suppose there could be deeper framework binders for consciousness than nervous systems, while others assert that there's no credible evidence for that. I think the idea I'm putting forward here is finite enough not to strong-arm one view or the other; to whatever extent you think I may have the seeds of that sort of violation in my definition, let me know and I can consider tweaking some of my descriptions. The reason I didn't say something like, for unitary consciousness, 'a probability field existing in a place where memory, recognition, and objects in relationship exist' was that I wanted to keep it simple; people can bring those requirements to the table for further contextualizing.

-- Updated September 17th, 2017, 2:59 pm to add the following --
The Beast wrote:The underlying cardinality spectrum has the counterpart dimension. It must be that this dimension is memory. A DNA strand is memory. To describe cardinality from its origin is ambitious. Is there a two-way directionality (chemical) as the probability of outcome? That is: Dimension in its present form can manipulate (if little) the spectrum of cardinality and therefore is alive/has will. Consciousness in R3 as the observer.
I fully agree with DNA being a big part of this - i.e. the optioning process is bound to very tight constraints, and those constraints are the accrued rules of one's genes.

To the second question - 'Is there a two-way directionality (chemical) as the probability of outcome?' - that's a very good question, one I've asked myself, but I think it runs up against the current limits of the state of the art in neurology, biology, and neurochemistry.

That raises another question for me as well: why don't I sense every chemical in every place in my being if my self-aware consciousness is a management tool that presides over all probabilities in my system? I would have to default back to neurology and related fields here, simply to say that there are a lot of processes we don't experience - the glut of information would paralyze us in functional domains critical to our survival and well-being - and so there are deeper (subconscious) processes which set definite criteria as to which relationships are considered for self-conscious/self-aware management and which aren't. We might also be looking at a model where a certain region of the brain is capable of running such an optioning or wait-and-hold service, but with a hard limit on how much of that process it can run, which would account for the parsimony in what's allowed in front of self-aware cognition and what isn't.

Interestingly, to RJG's touchstones, I do remember hearing a fair amount about a circulation of activity between the hypothalamus, visual cortex, and prefrontal cortex - that seems to be at least a big part of what compiles the 'stuff' of consciousness. What I'm more concerned with here, however, is the pilot light itself.

Ranvier
Posts: 538
Joined: February 12th, 2017, 1:47 pm
Location: USA

Re: Could this be an incisive definition for consciousness?

Post by Ranvier » September 17th, 2017, 5:23 pm

RJG wrote:
  • Consciousness is the singular experience of recognition, made possible by memory. (...that's it!!)
    • 1. Without 'memory', there can be nothing to recognize; no recognition.
      2. Without 'recognition', there is nothing to know.
      3. Without something to 'know', there is no consciousness.
I would disagree with all three assertions, here is why:

1. This is deceiving, because we're born "blank" but learn starting at some point between conception and birth. However, in an adult brain that suffered complete amnesia (including speech and motor memory), consciousness would still be able to differentiate "self" from everything else as "I". Therefore, the capability to "recognize" or learn is inherent to the conscious mind, albeit blank at such a moment.

Of course, I realize that there are two meanings to "recognize":

recognize (verb)

1. Identify (someone or something) from having encountered them before; know again: "I recognized her when her wig fell off." Also: identify from knowledge of appearance or character: "Pat is very good at recognizing wildflowers"; (of a computer or other machine) automatically identify and respond correctly to (a sound, printed character, etc.). Synonyms: identify, place, know, put a name to, remember, recall.

2. Acknowledge the existence, validity, or legality of: "the defense is recognized in Mexican law." Synonyms: acknowledge, accept, admit, realize, be aware of, be conscious of.

You are correct on this point when considering only the first meaning of "recognize".

2. There is something to "know" as soon as consciousness is "awakened", in its ability to acquire and process data input as it arrives from the senses. The human mind may not be conscious of the difference or implication in what is perceived, but it can certainly "recognize" a difference between colors, or learn quickly to "recognize" depth perception. There is always something to "know" in learning, regardless of previous memories. If what you assert were true, children would never learn anything.

3. That's also incorrect, in my opinion. We can isolate someone from any sensory input and that individual will remain conscious, with intact consciousness. However, I do understand what you convey as the "absence" of conscious thought if one were born without any ability of perception. The consciousness would be "there", but the conscious thought would be difficult to imagine without any prior information to "manipulate" in the mind.

Greta
Site Admin
Posts: 5546
Joined: December 16th, 2013, 9:05 pm

Re: Could this be an incisive definition for consciousness?

Post by Greta » September 17th, 2017, 6:15 pm

Papus79 wrote:... a set of probability fields ...
Papus, an immediate issue. Probability fields are mathematical models, not actual realities. Electrons are said to exist in a "probability field" but they don't because probability fields don't exist. You can also be modelled as a probability field; at any given moment you may be here or there, facing this way or that, accelerating, slowing, spinning (well, turning) etc. Your probability field could be calculated, but it would not be you :)

Papus79
Posts: 121
Joined: February 19th, 2017, 6:59 pm

Re: Could this be an incisive definition for consciousness?

Post by Papus79 » September 17th, 2017, 7:12 pm

Greta wrote:
Papus79 wrote:... a set of probability fields ...
Papus, an immediate issue. Probability fields are mathematical models, not actual realities. Electrons are said to exist in a "probability field" but they don't because probability fields don't exist.
That's the intuitive 'feel' of life - i.e. that probabilities should be nothing other than our inability to capture data accurately or give detailed reports. To say there's an 80% chance of rain on Tuesday in a given area means little, in the sense that it certainly will rain in some areas and not others, each place to various degrees, and the raining-or-not-raining question is a clear-cut thing. We're used to seeing that, and therefore we're used to seeing probabilities as highly pre-digested material fed to us for public consumption by such authorities.

Something that shed some light on this - I was watching Sean Carroll on Joe Rogan a few days back, and he was talking about the electron shells of atoms in a very different way from what you're describing. While I'll admit I'm not sure what to make of his blanket dismissal of 'particles popping in and out of existence' - that very well could be the case in vacuum turbulence, or at least it may operate by very different mechanics than the locality of an electron - Sean was completely unapologetic in stating that the current understanding of physics tells us the natural state of an electron is a field; a point in a given location is what it becomes when it entangles with another system, not what it is in its unperturbed state. He lent similar explanations to the double-slit experiment, i.e. that the natural state of light is a wave; it only becomes a particle when interacted with.

I think there is credible evidence to suggest that he's right on those accounts, i.e. that the 'probability field' state is a real thing. When we assume, intuitively, that it's just a plug figure for bad or highly incomplete mathematical modeling, we may be conflating very different things.
Greta wrote:You can also be modelled as a probability field; at any given moment you may be here or there, facing this way or that, accelerating, slowing, spinning (well, turning) etc. Your probability field could be calculated, but it would not be you :)
And that's just it - I ultimately choose to do something. There are many different models of probability collapse. While I think Sean's 'many worlds' hypothesis could have parsimonious modeling if we set up possible outcomes as cross-sections of the waves themselves (rather than absurd quantities of new universes boiling off every Planck second, we'd simply have another access of migration), I still think there's an underlying game being played here, which is our seeming desire to amputate any actual agency that consciousness has in the universe. When we go that route we end up with unsolvable problems and a menagerie of equally bizarre models of QM, offered in response to the measurement problem, that all fit the assumptions but veer off significantly. This is where I have to suggest that Penrose and Hameroff might be on the right track, as well as those who insist that probability does actually collapse based on certain principles of parsimony.

One thing we could research here, as a related issue, is the current state of quantum computing. Apparently older quantum computers were built relying on parsimony collapse, but that was seen to be a dead end. There are now firms, within the past year, suggesting they've been able to create properly gated systems at the quantum level, where the rule sets can be directly interacted with. I think that might help with your questions about probabilities.

-- Updated September 17th, 2017, 7:20 pm to add the following --

One quick correction - I meant *axis with my criticism of universal creation by divergence, not access.

Also, yes, to reiterate: I am suggesting - and I think I'll stand behind it for a while until or unless contrary evidence comes in - that to have consciousness, an 'I exist' feeling, is for your merged state vector of probability fields to be carefully regulated by a region of the brain which handles the use of that pilot light. To me, at least, it's a very strong indication that there is quantum computing going on in at least what feels like the prefrontal cortex, though it could be much broader and only identified there due to very judicious filtering by the amygdala and other reality-collapsing offices of the human nervous system.

The Beast
Posts: 745
Joined: July 7th, 2013, 10:32 pm

Re: Could this be an incisive definition for consciousness?

Post by The Beast » September 17th, 2017, 8:10 pm

The additional involvement of frontal regions may be rather specific to areas of control. My hypothesis is one of virtual particles happening, say, in the IPS. Who (the self) or what is next in the loop is a serious question. However, it is controlled by the flow of blood to the unrelated areas (AKA the mystery). At the edge of the sulcus is the correspondent. Where else could it be?... Anyway, virtual and metaphysical. It must be wave phase… virtual to believe.

Papus79
Posts: 121
Joined: February 19th, 2017, 6:59 pm

Re: Could this be an incisive definition for consciousness?

Post by Papus79 » September 17th, 2017, 8:44 pm

Considering the kinds of probability fields I'm describing, and their aggregation, it's quite possible the fields don't need to be in any specific place; really, if it were too specific you couldn't have such overlap of priorities being handled in different parts of the brain. It's quite possible that the whole brain is lit up with that general probability field, perhaps the whole body, but that there's either just one region or a very finite set of regions acting as a nexus between the data sets - which necessitates rigorous editing of the data perceived in self-aware processing in order not to fry the unit.

RJG
Posts: 836
Joined: March 28th, 2012, 8:52 pm

Re: Could this be an incisive definition for consciousness?

Post by RJG » September 17th, 2017, 9:13 pm

RJG wrote:Consciousness is the singular experience of recognition, made possible by memory. (...that's it!!)
Ranvier wrote:
RJG wrote:1. Without 'memory', there can be nothing to recognize; no recognition.
1. This is deceiving, because we're born "blank" but learn starting at some point between conception and birth…
If babies have no memories, then babies cannot recognize.

Babies develop memories through experiences. If baby experiences mama’s face enough times, then the experience of mama’s face will be etched in baby’s memory, and baby will then be able to ‘recognize’ his mama!

It is one thing to ‘experience’, and yet another to ‘know’ (recognize) our experiences. Many entities in life (including babies/adults, worms, plants, and single cell amoeba) can all ‘experience’ and auto-react accordingly. But only those entities that possess memory; and can recognize (i.e. can “know”) what they experience, are the ones that are considered ‘conscious’ subjects.
Ranvier wrote:…However, in an adult brain that suffered complete amnesia (including speech and motor memory), consciousness would still be able to differentiate "self" from everything else as "I". Therefore, the capability to "recognize" or learn is inherent to the conscious mind, albeit blank at such a moment.
Those with limited/impaired ‘memory’ will also have limited/impaired ‘recognition’.

Ranvier wrote:
RJG wrote: 2. Without 'recognition', there is nothing to know.
2. There is something to "know" as soon as consciousness is "awakened", in its ability to acquire and process data input as it arrives from the senses.
Not logically possible. How does one (or consciousness itself) identify/recognize this “something” without a means of identifying/recognizing?
Ranvier wrote:There is always something to "know" in learning, regardless of previous memories. If what you assert were true, children would never learn anything.
Not so. The “knowing” is the result of learning. Learning comes from ‘experiencing’, which develops ‘memory’, which then is available for ‘recognition’ (i.e. “knowing”).

Ranvier wrote:
RJG wrote:3. Without something to 'know', there is no consciousness.
3. That's also incorrect, in my opinion. We can isolate someone from any sensory input and that individual will remain conscious, with intact consciousness.
The stuff that we "know" comes from memory. Without (the stuff from) memory to know, there is nothing to know. If there is nothing to know, then there is nothing to "know"! Therefore, we can't "know" what our sensory inputs are telling us, we can't "know" what our thoughts are saying, we can't "know" what our body is doing - we can't "know" ANYTHING! …so without "knowing", there can be no consciousness.

Ranvier
Posts: 538
Joined: February 12th, 2017, 1:47 pm
Location: USA

Re: Could this be an incisive definition for consciousness?

Post by Ranvier » September 17th, 2017, 9:49 pm

RJG

So the difference between the human mind and a squirrel's is nothing more than the amount of memories held in the brain?

Papus79
Posts: 121
Joined: February 19th, 2017, 6:59 pm

Re: Could this be an incisive definition for consciousness?

Post by Papus79 » September 17th, 2017, 9:56 pm

RJG wrote: If babies have no memories, then babies cannot recognize.

Babies develop memories through experiences. If baby experiences mama’s face enough times, then the experience of mama’s face will be etched in baby’s memory, and baby will then be able to ‘recognize’ his mama!
I think to even get that far - and we're probably at least partially agreeing here - there's either something in the DNA itself which acts as that starting BIOS, or, also quite possible, the process of booting up sensory recognition is already happening somewhere in utero (a terribly un-PC thought, I know), and that BIOS bootstrapping from the DNA could be a very simple process that happens early but compounds its results as the system begins to self-regulate according to the rule sets given.

One of the real nightmares right now in human development is cell individuation. Recursive processes making iterative passes (i.e. functionalism with multiple realization) seem to lend a possibility where, the rule set being followed, bodies don't have to unfold or grow in a carbon-copy manner - chaos mathematics can handle blood-vessel, neural, and other bifurcations - but the self-aware process governing the equilibrium, constantly checking against the given rule set, makes the adjustments necessary to keep the project on track. This might also explain why psychoactive drugs can have such pernicious effects on embryonic and fetal development. All of that is pure speculation, of course, but this is where I think this model starts looking like it might at least offer hypothetical explanations that make sense.

Ranvier
Posts: 538
Joined: February 12th, 2017, 1:47 pm
Location: USA

Re: Could this be an incisive definition for consciousness?

Post by Ranvier » September 17th, 2017, 10:06 pm

So yes, we're basically acorn-eating squirrel machines, but with a slightly higher storage capacity, pre-programmed with the basic DNA software...

RJG
Posts: 836
Joined: March 28th, 2012, 8:52 pm

Re: Could this be an incisive definition for consciousness?

Post by RJG » September 17th, 2017, 10:43 pm

Ranvier wrote:So the difference between the human mind and a squirrel's is nothing more than the amount of memories held in the brain?
First of all, there is no human “mind”. There is only a human brain/body that experiences the thought/notion of “mind”.

Secondly, 'memory' gives us (and squirrels) the ability to ‘recognize’ some of our experiences. Our non-memory friends, such as worms, plants, and single cell amoeba, can still 'experience' and auto-react accordingly (i.e. experientially react), but they just don’t ‘know’ it; they are not ‘conscious’ of it.

Papus79 wrote:I think to even get that far - and we're probably at least partially agreeing here - there's either something in the DNA itself which acts as that starting BIOS…
…seems reasonable/rational.

Ranvier wrote:So yes, we're basically acorn-eating squirrel machines, but with a slightly higher storage capacity, pre-programmed with the basic DNA software...
An interesting way of putting it, …but in essence, yes!

Burning ghost
Posts: 1662
Joined: February 27th, 2016, 3:10 am

Re: Could this be an incisive definition for consciousness?

Post by Burning ghost » September 17th, 2017, 11:33 pm

Consciousness is the unified interpretive force of the brain. What that means we don't really know.

We are born with innate "knowledge". We are not blank slates.

We are nothing like silicon computers. It is a common analogy, but a limited one. Our brains are more like ecological systems than laptops. The forced use of analogies like "software" and "hardware" seems to have caused a great deal of confusion over time, IMO.

Papus -

I kind of see where you are coming from with the definition. My issue would be that you're assuming the brain is a cluster of separate brains functioning as a unity. The more popular view, I believe, is "distributed consciousness". I think it would be too presumptive to say that systems overlap in an "optimal" way; I don't think all systems act toward achieving some unified "optimal" mode. What seems to be the case is that various systems culminate in a hodgepodge of constantly varying degrees of influence. The most intriguing thing for me is how consciousness gives us a sense of 'authorship', and how this sense of 'authorship' feeds back into the whole system at large (and to what extent?). So my comment here is just a simple warning away from assuming that neural systems are acting 'rationally'/'logically' beneath this thing we feel and know now - this living, feeling consciousness that I am and you are.

You may find it interesting to study differences between the hemispheres. If you look at some studies on birds (whose visual hemispheres are much more separated than humans') you'll see some fascinating insights into the two quite opposing views of the world they are given. This is why I would steer away from saying what is deemed "optimal" for each system. Also, I apologise, because I am struggling to find the correct vocabulary here; that, to me, is probably the biggest issue we have in our confusion and perspective on the topic of consciousness.

And again, CONSCIOUSNESS is fairly well defined. We all have a very obvious understanding of it! The problem neuroscience has is applying the subjective feeling to scientific data. In some respects we have to accept that science helps us understand physical functions and patterns rather than explaining the "what" and "why". Science says "how", as best it can, and dares go no further (that is the job of the erratic human individual!)

RJG -

It is not simply a case of "memory". We have a capacity to understand, and our understanding shapes, and is shaped by, the environment. We don't learn to see faces; we are born with the capacity to recognize faces and to understand things in a spatial way. We lose neurons as those that are not put to use die off, and our brains steadily refine to process the information they're exposed to.

A baby, literally minutes after being born, can mimic facial expressions. Are you suggesting it does this because it 'recognises' what a face is, where its own face is, and how to copy the face hovering before it? (Seems unlikely, considering it's just opened its eyes for the first time.)

We are born sensitive to input, and through experience we filter more and more what input we receive (neurons that are not put to use simply die, for better efficiency - why waste the glucose?).

note: We are absolutely not "squirrel machines". Next you'll all be saying a squirrel is the same as a worm. Common features are merely common features.
AKA badgerjelly
