Papus79 wrote: ↑December 26th, 2020, 11:38 am
Greta wrote: ↑December 26th, 2020, 5:51 am
Such extreme rationalism seems to be, to some extent, a common human ideal. When you look at those who are most admired, they are the ones that seem most "bulletproof", the most invulnerable to nervousness and lack of confidence. People aspire to be like that, to just let it all flow without fear - to just do it, so to speak.
More machinelike. There appears to be a melding ahead between intelligent biology and intelligent geology.
Well, looking back at history, the violence was less technical and more hypermasculine. It's interesting to hear about the Roman centurions and realize just how much their military ethics resembled those of feudal Japan. I say that because zero-sum competition, and the level at which it has to be played to win, tends to define the nature and flavor of violence in a culture. These days, as a martial artist, it's tough to even mention that I train without getting looked at weird by guys who sit behind a desk 60 hours per week - their impression is that I have no clue what's important, because spending any time on self-improvement is almost like personally forfeiting the social-climbing race. The expression 'bringing a sword to a gunfight' comes to mind here (my criticism back at them: they don't get self-integration, what it has to do with mental health, or just how good certain martial arts are for your body and nervous system in ways broader than the fighting itself).
The larger and older societies grow, the more specialised people become, falling into fairly strict established roles. So schools no longer aim to produce rounded, well-adjusted graduates (with music, art and social studies ever more sidelined) - just specialised tools to suit particular tasks. Vale the polymath, rendered redundant by teams of specialists, having no hope of competing with such corporate sponsored output. Then the teams of specialists are studied and their methods incorporated into their even-more-efficient mechanical replacements.
Papus79 wrote: ↑December 26th, 2020, 11:38 am
As a programmer, it gets drilled into me by what I do that there's no self-aware magic in what I'm programming that isn't a reflection of my own thoughts. It wouldn't be bad to humanize software and put more of our own analog hum into it - social media aims for that, albeit in extractive ways. User interfaces really are a lot like separate pieces and parts animated against each other by fishing line (I think of observables in Angular and TypeScript that way). If there is panpsychism in these systems, it's so low-level that it seems akin to the panpsychism of a chair or table, which really makes me ask of my own favorite theory of consciousness (functionalism with multiple realizability) what forces or factors actually bind higher-level contracts to make self-organizing systems that aim to project themselves into the future.
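To make the "fishing line" image concrete, here is a minimal toy observable in plain TypeScript - an illustrative sketch of the pattern, not Angular's or RxJS's actual Observable API. When the wrapped value changes, every subscribed "piece" of the UI is tugged along in response; there is nothing going on beyond the wiring the programmer put there.

```typescript
type Listener<T> = (value: T) => void;

// Toy observable: holds a value and a list of attached listeners ("fishing lines").
class TinyObservable<T> {
  private listeners: Listener<T>[] = [];

  constructor(private value: T) {}

  // Attach a piece of the UI to the line; it immediately receives the current value.
  subscribe(fn: Listener<T>): void {
    this.listeners.push(fn);
    fn(this.value);
  }

  // Tug the line: every subscribed piece reacts to the new value.
  next(value: T): void {
    this.value = value;
    this.listeners.forEach(fn => fn(value));
  }
}

// Two "pieces" animated by the same line.
const counter = new TinyObservable<number>(0);
const seen: string[] = [];
counter.subscribe(n => seen.push(`label shows ${n}`));
counter.subscribe(n => seen.push(`badge shows ${n}`));
counter.next(1);
console.log(seen.join("; "));
```

Each listener fires once on subscription and once per update, so the mechanism is entirely transparent - which is exactly the point: any apparent "life" in the interface is just these threads being pulled.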
When it comes to the idea of weak panpsychism applied to barely sentient code, we are left wondering where the line is drawn between responsiveness and consciousness, between reflexes and feeling.
Which perhaps points to one of the larger aspects of reality that we cannot access - the subjective states of other entities, à la Nagel. We are constantly surrounded by beings whose subjective reality is inaccessible. We can infer based on circumstantial evidence, but the question "What is it like to be a [whatever]?" remains unanswered, despite recent gains in neuroscience and in our understanding of other animals' systems.
Papus79 wrote: ↑December 26th, 2020, 11:38 am
TBH, if people actually want to be bulletproof in the way of being emotionless, mechanical, all logic, with Cliffhanger nerves of steel, that's a great reflection of the current competitive environment, but the question arises: what will those people do when our technology gets far enough advanced, possibly in the next several decades, that even human competition is frustrated because what's happening is literally too far above our heads for most human capability to enter the race? I don't know if you ever watch or listen to Isaac Arthur's channel on YouTube; he's a physicist who's gotten really big on making space colonization videos, but he's also chewed on the question (generally in an optimistic way) of where we end up in a hundred years or so, when our technology really starts changing the fundamental game. His idea is that our biggest battles will be with redundancy and boredom, i.e., John Vervaeke's run at the meaning crisis will be even more salient to most people then than it is now.
I think in that last context, when most human work has been made obsolete, if our capacity for damaging each other through avarice is thwarted by the technology (i.e., even the uber-rich are made superfluous), then we have to work in a very different direction - on enriching the color depth of experience. It would be an environment where someone who's honed themselves into a corporate killing machine would be incredibly uncomfortable, because they'd have little application for their tools and would most likely become the butt of jokes as someone who couldn't adapt to the new environment.
In the end, the desire to be invulnerable is the desire to feel less, to filter out more - to be less able to access reality. Of course, without massive filtering of actual reality, we would be deluged by sensory data and unable to focus. So we have evolved to perceive only a greatly simplified model of reality - hence the thread's question.
Sensitivity is advantageous when life is good, and insensitivity helps in times of privation and peril. First we need to create decent, pleasant societies if we are to enrich the depth of our life experience. I'm thinking that humanity's best hope for happiness is the digitisation of our minds, so that minds can grow and be entertained without the need to kill or compete with other life forms (including human ones). As human and machine interdependencies increase, it would be interesting to see where it all ends up in a thousand years' time!