My thermostat thinks I’m too cold
- AverageBozo
- Posts: 502
- Joined: May 11th, 2021, 11:20 am
My thermostat thinks I’m too cold
My furnace’s thermostat receives input, processes that input and then produces output. For all I know, this is evidence that without consciousness it is still possible for inanimate objects to think.
Even though the process simply follows the instructions of a program, my thermostat performs the processing. “Mindlessly” does not apply, because the processing unit of the thermostat, i.e. its brain, must read the instructions in order to process the input.
If an inanimate object can think without possessing consciousness, then certainly a human can. Even in a coma, a human can react to a painful stimulus whenever the stimulus exceeds a certain threshold.
Perhaps thinking is possible in my thermostat because it is animated rather than inanimate. After all, a rock cannot think, or can it? When the law of gravity is applied to a rock on the edge of a precipice, it receives input and decides to roll.
No, I am not crazy. I just have a lot of time on my hands.
- AverageBozo
- Posts: 502
- Joined: May 11th, 2021, 11:20 am
Re: My thermostat thinks I’m too cold
AverageBozo wrote: ↑July 12th, 2021, 11:34 am
My furnace’s thermostat receives input, processes that input and then produces output. For all I know, this is evidence that without consciousness it is still possible for inanimate objects to think.

The input for my thermostat is the ambient temperature around it. It thinks I’m too cold if that temperature drops below a given set point.
- LuckyR
- Moderator
- Posts: 7932
- Joined: January 18th, 2015, 1:16 am
Re: My thermostat thinks I’m too cold
AverageBozo wrote: ↑July 12th, 2021, 11:34 am
My furnace’s thermostat receives input, processes that input and then produces output. For all I know, this is evidence that without consciousness it is still possible for inanimate objects to think.

This is one of those cases where evaluation of the issue is mostly dependent on the exact definition of the terms in the OP.
- Count Lucanor
- Posts: 2318
- Joined: May 6th, 2017, 5:08 pm
- Favorite Philosopher: Umberto Eco
- Location: Panama
- Contact:
Re: My thermostat thinks I’m too cold
AverageBozo wrote: ↑July 12th, 2021, 11:34 am
My furnace’s thermostat receives input, processes that input and then produces output. For all I know, this is evidence that without consciousness it is still possible for inanimate objects to think.

Under that criterion, digestion is the equivalent of thinking. Let’s not get into the possible analogies between the outputs.
- Sculptor1
- Posts: 7091
- Joined: May 16th, 2019, 5:35 am
Re: My thermostat thinks I’m too cold
AverageBozo wrote: ↑July 12th, 2021, 11:34 am
My furnace’s thermostat receives input, processes that input and then produces output. For all I know, this is evidence that without consciousness it is still possible for inanimate objects to think.

No.
Thermostats are simple switches that work with a simple bi-metallic conductive strip. Two materials laminated together expand by different amounts when heated. This causes the strip to bend, which makes or breaks a simple electrical contact, switching the boiler off or on.
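The switching behaviour described above can be sketched in a few lines of Python. This is purely an illustrative model of the make-or-break contact, not real firmware; the set point value and all names are invented for the sketch:

```python
# Minimal model of a bi-metallic thermostat contact: below the set
# point the strip's bend closes the circuit (boiler on); at or above
# it, the contact is broken (boiler off). No program is "read" here --
# the whole behaviour is a single comparison.

SET_POINT_C = 20.0  # assumed set point, in degrees Celsius


def boiler_on(ambient_temp_c: float) -> bool:
    """Contact closed (boiler on) when the room is below the set point."""
    return ambient_temp_c < SET_POINT_C


if __name__ == "__main__":
    for temp in (18.0, 20.0, 22.0):
        state = "on" if boiler_on(temp) else "off"
        print(f"{temp:.1f} °C -> boiler {state}")
```

The point of the sketch is how little is going on: the entire "decision" is one threshold comparison, with no stored representation of anything.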
- Sculptor1
- Posts: 7091
- Joined: May 16th, 2019, 5:35 am
Re: My thermostat thinks I’m too cold
AverageBozo wrote: ↑July 12th, 2021, 11:38 am
The input for my thermostat is the ambient temperature around it. It thinks I’m too cold if that temperature drops below a given set point.

This is anthropomorphizing. You are attaching the word "thinks" to a thing which cannot and does not think.
- Pattern-chaser
- Premium Member
- Posts: 8268
- Joined: September 22nd, 2019, 5:17 am
- Favorite Philosopher: Cratylus
- Location: England
Re: My thermostat thinks I’m too cold
AverageBozo wrote: ↑July 12th, 2021, 11:34 am
Even though the process simply follows the instructions of a program, my thermostat performs the processing.

A thermostat operates by using some physical material that responds directly to changes in temperature. No program; no instructions; no processing; no thinking. ... Not even 'thinking' in the sense we mean when we apply it to a computer and its program.
"Who cares, wins"
- Consul
- Posts: 6036
- Joined: February 21st, 2014, 6:32 am
- Location: Germany
Re: My thermostat thinks I’m too cold
The inanimate information processing of thermostats has nothing to do with cogitation (thought) or cognition in the psychological sense.
QUOTE>
"If we are going to explore cognition from an evolutionary point of view, we need a precise definition of what it is. As used here, cognition will refer to processes that underlie the acquisition of knowledge by creating internal representations of external events and storing them as memories that can later be used in thinking, reminiscing, and musing, and when behaving. Its dependence on internal representations of things or events, in the absence of the external referent of the representation, is what makes cognition different from noncognitive forms of information processing. Given this definition, processes that allow behavioral responses to an immediately present stimulus are not, strictly speaking, under cognitive control. Only responses that depend on internal representations are."
(LeDoux, Joseph. The Deep History of Ourselves: The Four-Billion-Year Story of How We Got Conscious Brains. New York: Viking, 2019. pp. 205-6)
<QUOTE
-
- Posts: 2181
- Joined: January 7th, 2015, 7:09 am
Re: My thermostat thinks I’m too cold
AverageBozo wrote: ↑July 12th, 2021, 11:34 am
My furnace’s thermostat receives input, processes that input and then produces output. For all I know, this is evidence that without consciousness it is still possible for inanimate objects to think.

It's possible that any/all objects have some sort of experiential states which we don't recognise, but if the behaviour of the thermostat or whatever can be explained without invoking that, then it's reasonable to assume it's not there, for now.
Searle's Chinese Room thought experiment looks at a non-experiential computer Behaving/Functioning As If it understands what it's doing, without actually doing so, similar to a thermostat.
''Searle's thought experiment begins with this hypothetical premise: suppose that artificial intelligence research has succeeded in constructing a computer that behaves as if it understands Chinese. It takes Chinese characters as input and, by following the instructions of a computer program, produces other Chinese characters, which it presents as output. Suppose, says Searle, that this computer performs its task so convincingly that it comfortably passes the Turing test: it convinces a human Chinese speaker that the program is itself a live Chinese speaker. To all of the questions that the person asks, it makes appropriate responses, such that any Chinese speaker would be convinced that they are talking to another Chinese-speaking human being.
The question Searle wants to answer is this: does the machine literally "understand" Chinese? Or is it merely simulating the ability to understand Chinese? Searle calls the first position "strong AI" and the latter "weak AI".
Searle then supposes that he is in a closed room and has a book with an English version of the computer program, along with sufficient papers, pencils, erasers, and filing cabinets. Searle could receive Chinese characters through a slot in the door, process them according to the program's instructions, and produce Chinese characters as output. If the computer had passed the Turing test this way, it follows, says Searle, that he would do so as well, simply by running the program manually.
Searle asserts that there is no essential difference between the roles of the computer and himself in the experiment. Each simply follows a program, step-by-step, producing a behavior which is then interpreted by the user as demonstrating intelligent conversation. However, Searle himself would not be able to understand the conversation. ("I don't speak a word of Chinese," he points out.) Therefore, he argues, it follows that the computer would not be able to understand the conversation either.
Searle argues that, without "understanding" (or "intentionality"), we cannot describe what the machine is doing as "thinking" and, since it does not think, it does not have a "mind" in anything like the normal sense of the word. Therefore, he concludes that the "strong AI" hypothesis is false.''
https://en.wikipedia.org/wiki/Chinese_room