The most reasonable conclusion to this discussion is that AI cannot create anything that the engineers, the humans, do not include within its parameters. Its limitations come from the inherent nature of its design. There's no evidence that designers can overcome this limitation.
Debate with ChatGPT
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: Debate with ChatGPT
- Sy Borg
- Site Admin
- Posts: 15148
- Joined: December 16th, 2013, 9:05 pm
Re: Debate with ChatGPT
Count Lucanor wrote: ↑January 18th, 2023, 8:06 am
"The most reasonable conclusion to this discussion is that AI cannot create anything that the engineers, the humans, do not include within its parameters. Its limitations come from the inherent nature of its design. There's no evidence that designers can overcome this limitation."

Why would computers be the only domain of reality in the entire universe that is not potentially subject to synergies and emergences over time with complexification? They are already capable of unexpected results and, certainly, AI can create code that human programmers cannot work out.

There's no evidence that human designers can overcome current limitations, but there's no evidence that AI itself will fail to progress, or fail to ever build other AIs.
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: Debate with ChatGPT
Sy Borg wrote: ↑January 18th, 2023, 4:51 pm
"Why would computers be the only domain of reality in the entire universe that is not potentially subject to synergies and emergences over time with complexification? They are already capable of unexpected results and, certainly, AI can create code that human programmers cannot work out. There's no evidence that human designers can overcome current limitations, but there's no evidence that AI itself will fail to progress, or fail to ever build other AIs."

Computers are not organisms; their level of autonomy is close to zero. Their potential is inevitably determined by the potential of human beings to overcome the inherent design limitations of machines. If there is no evidence of an actual accomplishment, then that's it: saying that the openness of the future guarantees that it will be accomplished is simply wrong. There's no evidence that humans will be able to travel at speeds higher than the speed of light; it does not mean that possibility is forever closed to humans, but for now, we cannot assert that it will happen.
- Sy Borg
- Site Admin
- Posts: 15148
- Joined: December 16th, 2013, 9:05 pm
Re: Debate with ChatGPT
Count Lucanor wrote: ↑January 19th, 2023, 11:34 am
"Computers are not organisms; their level of autonomy is close to zero. Their potential is inevitably determined by the potential of human beings to overcome the inherent design limitations of machines. If there is no evidence of an actual accomplishment, then that's it: saying that the openness of the future guarantees that it will be accomplished is simply wrong. There's no evidence that humans will be able to travel at speeds higher than the speed of light; it does not mean that possibility is forever closed to humans, but for now, we cannot assert that it will happen."

"Computers are not organisms". Most enlightening. Did you know that stars, planets, rocks and oceans all complexify and metamorphose over time too, not just biology? Entities do not necessarily need the mutation of long-chain nucleotides to change over time. It's not (technically) called "evolution", but geologic and plasma entities certainly do evolve, e.g. the Earth's geology today (even before plastic) is far different from that of the planet's Hadean Era.
There is, of course, zero doubt that AI will move FAR beyond its current limits, just as it has already progressed from mere ATMs and supermarket checkouts. What's unknown are the nature and tempo of the changes. Whether humans or AI make the changes is moot; what matters is that the changes are happening.

If you think that all people working on AI will be responsible and not imbue it with increasing autonomy, then you have not paid attention to human history. The complexification I spoke about WILL happen, and it will not always be prosaic, "harmless" changes when Chinese and Russian researchers compete for the lead in what promises to be by far the most powerful technology ever known. They have already agreed that there will be "no limits" to what they will do to defeat the west. Everything is on the table, including nukes, germ warfare, hacking and AI drone attacks.

AI will be granted ever more autonomy because that will make it more useful and effective, allowing it to do things that we cannot.
- Tegularius
- Posts: 712
- Joined: February 6th, 2021, 5:27 am
Re: Debate with ChatGPT
Count Lucanor wrote: ↑January 19th, 2023, 11:34 am
"Computers are not organisms, their level of autonomy is close to zero."

Once a critical degree of complexity becomes active through a continuous compounding of itself, that is exactly what it can become... an organism supervised by its own processes, which is how pretty much everything else got started, including us.
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: Debate with ChatGPT
Sy Borg wrote: ↑January 19th, 2023, 3:14 pm
""Computers are not organisms". Most enlightening. Did you know that stars, planets, rocks and oceans all complexify and metamorphose over time too, not just biology? Entities do not necessarily need the mutation of long-chain nucleotides to change over time."

No kidding!! Everything changes, nothing remains the same. Nothing in my posts in this discussion suggests the opposite. I didn't even say anything about computer technology not evolving; in fact, I said that it has had and will have an enormous influence on society. I simply don't share the optimism of those who believe that technology evolves autonomously, the way organisms do, and that AI technology is on its way to acquiring any level of consciousness. The evolution of life has very little in common with the evolution of man-made tools.
Technically speaking, geological change can be called evolution. It cannot be called biological evolution, though.
Sy Borg wrote: ↑January 19th, 2023, 3:14 pm
"There is, of course, zero doubt that AI will move FAR beyond its current limits, just as it has already progressed from mere ATMs and supermarket checkouts. What's unknown are the nature and tempo of the changes. Whether humans or AI make the changes is moot; what matters is that the changes are happening."

Nonsense. Different processes, different causes, different potentiality of changes. It does matter.
Sy Borg wrote: ↑January 19th, 2023, 3:14 pm
"If you think that all people working on AI will be responsible and not imbue it with increasing autonomy, then you have not paid attention to human history. The complexification I spoke about WILL happen, and it will not always be prosaic, 'harmless' changes when Chinese and Russian researchers compete for the lead in what promises to be by far the most powerful technology ever known. They have already agreed that there will be 'no limits' to what they will do to defeat the west. Everything is on the table, including nukes, germ warfare, hacking and AI drone attacks."

For the purpose of this discussion, it really matters very little what people working on AI want to do. I'm discussing what they CAN effectively accomplish, and giving consciousness to machines is nothing they are close to achieving.
Granted? That doesn’t seem the most appropriate word. You cannot grant anything to a lifeless, inanimate computer circuit, just as you can’t to a puppet. A Tesla car is pretty “autonomous”, too, but it will not move to any place that a human doesn’t want it to be.
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: Debate with ChatGPT
Tegularius wrote: ↑January 21st, 2023, 10:26 pm
"Once a critical degree of complexity becomes active through a continuous compounding of itself, that is exactly what it can become... an organism supervised by its own processes, which is how pretty much everything else got started, including us."

More of the technological myth that AI enthusiasts embrace with more wishful thinking than actual evidence. No, computers are not organisms.
- Mercury
- Posts: 377
- Joined: December 17th, 2013, 6:36 pm
Re: Debate with ChatGPT
Count Lucanor wrote: ↑January 23rd, 2023, 7:52 pm
"You cannot grant anything to a lifeless, inanimate computer circuit, just as you can't to a puppet."

I think that's incorrect. In my view, the nature of the hardware is irrelevant. Consciousness is a product of parallel information processing; sufficiently powerful computers are capable of running numerous programs simultaneously, and of having the output of one process feed into others. The software architecture of consciousness may not yet have been designed, but it will evolve if it has to write itself. I cannot envisage the computer code equivalent of DNA, or what conditions would select for elements of computer consciousness, but the mechanism of mutation, testing and extinction of arrangements of information is not confined to biology.
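Mercury's narrow claim here — that mutation, testing and extinction of arrangements of information are not confined to biology — is at least demonstrable in an engineering sense: evolutionary algorithms apply exactly that loop to bit-strings. A minimal sketch, with the caveat that the all-ones fitness target and every parameter are arbitrary choices for illustration (this says nothing about consciousness, only about variation-plus-selection on a non-biological substrate):

```python
import random

def evolve_onemax(n_bits=32, pop_size=40, generations=200,
                  mutation_rate=0.02, seed=0):
    """Evolve random bit-strings toward all-ones via mutation,
    fitness testing, and extinction of the weakest candidates."""
    rng = random.Random(seed)

    def fitness(bits):  # "testing": score each arrangement of information
        return sum(bits)

    # Start from purely random arrangements.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]              # "extinction" of the bottom half
        children = [[bit ^ (rng.random() < mutation_rate) for bit in parent]
                    for parent in survivors]           # "mutation" of copied survivors
        pop = survivors + children
        if fitness(pop[0]) == n_bits:                  # stop once a perfect string evolves
            break
    return max(pop, key=fitness)

best = evolve_onemax()
```

Whether such a loop could ever select for anything like consciousness is precisely what the thread disputes; the sketch only shows the mechanism itself running outside biology.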
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: Debate with ChatGPT
Mercury wrote: ↑January 23rd, 2023, 11:28 pm
"I think that's incorrect. In my view, the nature of the hardware is irrelevant. Consciousness is a product of parallel information processing; sufficiently powerful computers are capable of running numerous programs simultaneously, and of having the output of one process feed into others. The software architecture of consciousness may not yet have been designed, but it will evolve if it has to write itself. I cannot envisage the computer code equivalent of DNA, or what conditions would select for elements of computer consciousness, but the mechanism of mutation, testing and extinction of arrangements of information is not confined to biology."

You're endorsing the computational theory of mind, which equates the brain with a computer. It is not.
The Brain is Neither a Neural Network Nor a Computer
Underlying much of artificial intelligence research is what Alan Jasanoff calls the cerebral mystique— the idea that the essence of an individual resides in the brain. In The Biological Mind, he argues that this idea neglects the fundamental lesson of neuroscience. The brain is a biological organ embedded in a physical environment. A brain cannot function independently from the body and its surrounding world. Dismantling this myth will allow us to understand what is reasonable to expect from artificial intelligence, as well as technology designed to improve human life.
Although Jasanoff’s book argues from a neuroscience perspective, it supports the idea of embodied cognition—our cognition does not just come from the brain, but from the interaction of a particular being with its environment. Some refer to this as an “ecological approach.” The brain does not make a computation to process sensory input into actions. The brain uses perception and action to link the body to the environment.
- Mercury
- Posts: 377
- Joined: December 17th, 2013, 6:36 pm
Re: Debate with ChatGPT
Count Lucanor wrote: ↑January 24th, 2023, 12:27 am
"You're endorsing the computational theory of mind, which equates the brain with a computer. It is not."

I'd rather debate with you than with a book neither of us has read. I know I haven't read it, and it doesn't seem you have either, because the passage you quote is irrelevant to the argument I'm making. I haven't said anything about what the brain is or is not. What I'm saying is that consciousness can be hosted on platforms other than the biological, because consciousness is software, i.e. the ghost in the machine is equivalent to a complex computer program.
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: Debate with ChatGPT
Mercury wrote: ↑January 24th, 2023, 1:25 am
"I'd rather debate with you than with a book neither of us has read. I know I haven't read it, and it doesn't seem you have either, because the passage you quote is irrelevant to the argument I'm making. I haven't said anything about what the brain is or is not. What I'm saying is that consciousness can be hosted on platforms other than the biological, because consciousness is software, i.e. the ghost in the machine is equivalent to a complex computer program."

If you really wanted to debate, you could have made a little effort to read what's being presented as an argument, so that I didn't have to repeat myself. I told you that you were endorsing the computational theory of mind, which equates the brain with a computer, and I gave you a link that specifically deals with that issue. After you ignored it, you now come back to say exactly the same thing, equating consciousness with software (for the brain hardware), as if that were not the computational theory of mind.
- Mercury
- Posts: 377
- Joined: December 17th, 2013, 6:36 pm
Re: Debate with ChatGPT
Count Lucanor wrote: ↑January 24th, 2023, 1:21 pm
"If you really wanted to debate, you could have made a little effort to read what's being presented as an argument, so that I didn't have to repeat myself. I told you that you were endorsing the computational theory of mind, which equates the brain with a computer, and I gave you a link that specifically deals with that issue. After you ignored it, you now come back to say exactly the same thing, equating consciousness with software (for the brain hardware), as if that were not the computational theory of mind."
The problem is, I'm not seeking to explain the human mind, which is what a computational theory of mind does. It explains the human mind using computers as an analogy, i.e.:
3. The classical computational theory of mind
Warren McCulloch and Walter Pitts (1943) first suggested that something resembling the Turing machine might provide a good model for the mind.
I'm not doing that. That's not the question I'm addressing. I'm addressing the question of computer-based consciousness. You may be aware that a Google computer scientist, Blake Lemoine, became convinced that their chatbot, LaMDA, is sentient. And I'm suggesting, given the massive parallel processing, self-correcting diagnostics and feedback loops in the chatbot programming, operating on the sum of human knowledge on the internet, that doesn't strike me as impossible, or even unlikely.
Some years ago I read Hubert Dreyfus's classic 'What Computers Can't Do'. It was a response to the techno-optimism of Herbert A. Simon, who predicted:
1. A computer would be world champion in chess.
2. A computer would discover and prove an important new mathematical theorem.
3. Most theories in psychology will take the form of computer programs.
1. Chess AI is now far in advance of the best chess players in the world. Carlsen, Nakamura and Praggnanandhaa are brilliant players, but they're crushed by AI in the way they would crush me at chess.
2. ScienceDaily just reported on AI reducing a 10,000-factor quantum physics problem to four equations.
As for the last prediction, I haven't read this but:
3. Artificial intelligence in Psychology.
Artificial intelligence with human psychological cognition can not only simulate the rational thinking of the "brain," but also reproduce the perceptual thinking of the "heart," and can realize emotional interaction between people and machines, and between machines and machines, similar to human communication.
Cognitive psychology-based artificial intelligence review
https://www.frontiersin.org › fnins.2022.1024316 › full
If true computer consciousness is not here already, it will be soon.
- Sy Borg
- Site Admin
- Posts: 15148
- Joined: December 16th, 2013, 9:05 pm
Re: Debate with ChatGPT
Count Lucanor wrote: ↑January 23rd, 2023, 7:52 pm
"No kidding!! Everything changes, nothing remains the same. Nothing in my posts in this discussion suggests the opposite. I didn't even say anything about computer technology not evolving; in fact, I said that it has had and will have an enormous influence on society. I simply don't share the optimism of those who believe that technology evolves autonomously, the way organisms do, and that AI technology is on its way to acquiring any level of consciousness. The evolution of life has very little in common with the evolution of man-made tools."
You say "no kidding" as though it's obvious, yet you fail to acknowledge that emergences can result from complexification.
I am not optimistic. Please stop misrepresenting me in this way. I am much older than you are and, like other ancient ones, I don't much care for many changes that run counter to my conditioning. Then again, what I want is no more important than what my dear, departed dog wanted. These are tiny ripples within a tsunami.
The potentials of AI are huge, for better or for worse. Who will benefit or be harmed by these changes, and by how much, is not yet determined. The world is moving into a new phase, and the internet and AI are key to the changes.
There is no doubt going to be increased integration between humans and AI. Neuralink is far from the only company exploring this. As more is learned, researchers move towards mapping human brains and being able to reproduce a brain's dynamics digitally.
As I have often noted, at this stage researchers have yet to reproduce the complexity of even the tiny brain of C. elegans, because they are still discovering more about the complexity of neurons and their interactions.
Complaining about types of processes only distracts from the fact that all development need not be of exactly the same nature. AI will undergo its own changes. It will be the first technology capable of creating better versions of itself.Count Lucanor wrote: ↑January 23rd, 2023, 7:52 pmNonsense. Different processes, different causes, different potentiality of changes. It does matter.Sy Borg wrote: ↑January 19th, 2023, 3:14 pm There is, of course, zero doubt that AI will move FAR beyond its current limits, just as it has already progressed from mere ATMs and supermarket checkouts. What's unknown are the nature and tempo of the changes. Whether humans or AI make the changes is moot, what matters is the changes are happening.
Wrong. Intentions are key, and qualia are not needed. Acquaint yourself with the hypothetical Paper Clip Maximiser.Count Lucanor wrote: ↑January 23rd, 2023, 7:52 pmFor the purpose of this discussion, it really matters very little what people working on AI want to do. I’m discussing what they CAN effectively accomplish, and giving consciousness to machines is nothing they are close to achieving.Sy Borg wrote: ↑January 19th, 2023, 3:14 pm If you think that all people working on AI will be responsible and not imbue it with increasing autonomy then you have not paid attention to human history. The complexification I spoke about WILL happen - and it will not always be prosaic, "harmless" changes when Chinese and Russian researchers compete for the lead in what promises to be by far the most powerful technology ever known. They have already agreed that there will be "no limits" to what they will do to defeat the west. Everything is on the table, including nukes, germ warfare, hacking and AI drone attacks.
There are lines that the west won't cross, the upshot being that, globally, progress on AI will effectively be completely unregulated. We saw how great empowerment plus weak regulation worked out in 2020-21.
Semi-autonomous drones are already in service and fully autonomous drones have been developed. They are being made by Aerorozvidka, a Ukrainian NGO. There is no doubt that at some stage drones (which are certainly not sentient) will be granted autonomy to act without supervision, because semi-autonomous drones can be interfered with by jamming comms frequencies.Count Lucanor wrote: ↑January 23rd, 2023, 7:52 pmGranted? That doesn’t seem the most appropriate word. You cannot grant anything to a lifeless, inanimate computer circuit, just as you can’t to a puppet. A Tesla car is pretty “autonomous”, too, but it will not move to any place that a human doesn’t want it to be.
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: Debate with ChatGPT
The computational theory of mind is what's behind the "sentient theory of computation": once you believe that consciousness and the brain are the equivalent of software and hardware, you start believing that what an advanced computer does can be called consciousness. I remember Blake Lemoine; I felt sorry for him on that occasion, falling so naively for the AI narrative.Mercury wrote: ↑January 24th, 2023, 2:24 pm That's not the question I'm addressing. I'm addressing the question of computer based consciousness. You may be aware that a Google computer scientist, Blake Lemoine, became convinced that their chatbot, LaMDA, is sentient. And I'm suggesting that, given the massive parallel processing, self correcting diagnostics and feedback loops in the chatbot programming, operating in regard to the sum of human knowledge on the internet, that doesn't strike me as impossible, or even unlikely.
If that "true computer consciousness" is something entirely different from the consciousness of living things, I don't see why we would call it consciousness. The label would be misleading.