
The Function of Thought

rousseau

A book I've been reading at the moment has got me thinking about thought as a physiological function. Conversation at Talk Freethought often pops up about the mysteries of consciousness and awareness, but I think when you boil it down consciousness is essentially the sum total of our sensory system that orients our behavior across time. This is what I'd argue the function of thought is too - a mechanism to orient our behavior across time.

I heard an interesting comment from abaddon a while back about identifying with our thoughts, which was a new way of putting it to me: the notion that we are not actually our thoughts, implying that thought is a continual process of the brain that just happens.

So my question is: as a continual process of the brain that orients our behavior across time, what exactly is thought doing?

My thoughts about it are as follows:

- our imagination connects us to the future and allows us to react more appropriately to future obstacles
- our memory connects us to the past and keeps us focused on urgent, immediate problems, and people who are important to us
- there is an inherent immediacy to it, by definition thought will dwell on what is immediately important, and particularly other people
- the sensory system is a data collector that provides raw material to be used by thought to better orient behavior
- on some level thought is connected with visceral feeling, which gives us an incentive to resolve problems

Is that pretty much it? Anything I'm missing?
 
- our imagination connects us to the future and allows us to react more appropriately to future obstacles

Yes. There is an "us". A thing aware of our thoughts and also a driver of thoughts if so inclined.

There is not just thinking. There is a thing that thinks. A thing aware of thoughts and in partial control of thoughts.

- our memory connects us to the past and keeps us focused on urgent, immediate problems, and people who are important to us

Memory does not keep us focused on anything. It is just a tool the thing aware of thoughts and aware of memories can use for whatever it chooses to use it for.

Listen to an old song. Smoke some marijuana. And the memories are just there to contemplate and even enjoy. Nostalgia is an outgrowth of such enjoyment.

- there is an inherent immediacy to it, by definition thought will dwell on what is immediately important, and particularly other people

Thought will dwell on whatever the thing aware of thought desires to dwell on. It is possible to focus on the now and the visual stimulation. But daydreaming is being unaware of your immediate surroundings and not caring about them and being focused on thoughts unrelated to them.

- the sensory system is a data collector that provides raw material to be used by thought to better orient behavior

It is used by the thing aware of thought. It is also the thing aware of sensory stimulation and emotions and desires and drives.

We have not moved past: I think therefore I exist.

There is still thought. And still the "I" aware of thoughts and able to move thoughts to a degree. Moving thoughts is a great talent. Those that do it well are called geniuses.
 
A book I've been reading at the moment has got me thinking about thought as a physiological function. Conversation at Talk Freethought often pops up about the mysteries of consciousness and awareness, but I think when you boil it down consciousness is essentially the sum total of our sensory system that orients our behavior across time. This is what I'd argue the function of thought is too - a mechanism to orient our behavior across time.
...

I have conscious thoughts as well as unconscious thoughts. Why some thoughts are conscious and some aren't is an excellent question in itself. Thoughts are what the brain does in order to organize information and respond to the environment in order to survive. That's a given. But even a brainless jellyfish can respond to the environment in ways that allow it to orient itself towards food sources. Chemical messages that coordinate detection and locomotion in beneficial ways. But when lots of stimuli and many modes of locomotion need to be coordinated a central processor is needed in order to coordinate them efficiently. Efficiency becomes progressively more important as complexity, size, and density increase. Somehow the ability to create models of the environment emerges. That would be good to understand. But I think it must have a lot to do with maximizing the efficient use of energy and minimizing conflict between the various thought centers within the brain. Plus plain old random neural evolution and complex positive and negative feedback involving support-cell processes.

There must be some survival-related reason that some thoughts are conscious. But I think it has something to do with that ability to create models. Conscious awareness might simply be the awareness of that which the brain knows best: the Self. But I like the idea that it involves an orientation. Conscious awareness is the awareness of some thing vis-à-vis the Self. Yeah, that works.
 
My 'broken record response' is always that thought is a function of physical biological processes in the brain. There is no mystery.

Metaphysical attempts to define it are pre-scientific, and are hopelessly self-referential and circular.

The brain evolved with no purpose; it just is.

Do animals have thoughts? Interpreting animal behavior, I'd say yes.
 
My 'broken record response' is always that thought is a function of physical biological processes in the brain. There is no mystery.

Metaphysical attempts to define it are pre-scientific, and are hopelessly self-referential and circular.

The brain evolved with no purpose; it just is.

Do animals have thoughts? Interpreting animal behavior, I'd say yes.

All is mystery. Every phenomenon is a complete mystery.

We have no idea what anything is. Only how things are organized and how they behave.

To not see the profound mystery of existence is to not understand it.
 
A book I've been reading at the moment has got me thinking about thought as a physiological function. Conversation at Talk Freethought often pops up about the mysteries of consciousness and awareness, but I think when you boil it down consciousness is essentially the sum total of our sensory system that orients our behavior across time. This is what I'd argue the function of thought is too - a mechanism to orient our behavior across time.
...

I have conscious thoughts as well as unconscious thoughts. Why some thoughts are conscious and some aren't is an excellent question in itself. Thoughts are what the brain does in order to organize information and respond to the environment in order to survive. That's a given. But even a brainless jellyfish can respond to the environment in ways that allow it to orient itself towards food sources. Chemical messages that coordinate detection and locomotion in beneficial ways. But when lots of stimuli and many modes of locomotion need to be coordinated a central processor is needed in order to coordinate them efficiently. Efficiency becomes progressively more important as complexity, size, and density increase. Somehow the ability to create models of the environment emerges. That would be good to understand. But I think it must have a lot to do with maximizing the efficient use of energy and minimizing conflict between the various thought centers within the brain. Plus plain old random neural evolution and complex positive and negative feedback involving support-cell processes.

There must be some survival-related reason that some thoughts are conscious. But I think it has something to do with that ability to create models. Conscious awareness might simply be the awareness of that which the brain knows best: the Self. But I like the idea that it involves an orientation. Conscious awareness is the awareness of some thing vis-à-vis the Self. Yeah, that works.
Creating models sounds about right. There is an obvious advantage to quick reaction times: more effective hunting, better social skills, etc.

The conscious integrates new information into what will later be automatic responses, the already known. With enough time and minimal variation in daily routine you're essentially living on auto-pilot, maximum efficiency and appropriate affect.
 
There are advantages to vision, but vision does not arise because it is useful. It arises in a random manner. Genes mutate and change in a random manner, and genes have no idea what their end products are.

If the end products help with survival in some way they may remain along with the genes.

Thinking, like all things, arises by chance.

It has clear survival advantages so it remains.
 
A book I've been reading at the moment has got me thinking about thought as a physiological function. Conversation at Talk Freethought often pops up about the mysteries of consciousness and awareness, but I think when you boil it down consciousness is essentially the sum total of our sensory system that orients our behavior across time. This is what I'd argue the function of thought is too - a mechanism to orient our behavior across time.
...

I have conscious thoughts as well as unconscious thoughts. Why some thoughts are conscious and some aren't is an excellent question in itself. Thoughts are what the brain does in order to organize information and respond to the environment in order to survive. That's a given. But even a brainless jellyfish can respond to the environment in ways that allow it to orient itself towards food sources. Chemical messages that coordinate detection and locomotion in beneficial ways. But when lots of stimuli and many modes of locomotion need to be coordinated a central processor is needed in order to coordinate them efficiently. Efficiency becomes progressively more important as complexity, size, and density increase. Somehow the ability to create models of the environment emerges. That would be good to understand. But I think it must have a lot to do with maximizing the efficient use of energy and minimizing conflict between the various thought centers within the brain. Plus plain old random neural evolution and complex positive and negative feedback involving support-cell processes.

There must be some survival-related reason that some thoughts are conscious. But I think it has something to do with that ability to create models. Conscious awareness might simply be the awareness of that which the brain knows best: the Self. But I like the idea that it involves an orientation. Conscious awareness is the awareness of some thing vis-à-vis the Self. Yeah, that works.
Creating models sounds about right. There is an obvious advantage to quick reaction times: more effective hunting, better social skills, etc.

The conscious integrates new information into what will later be automatic responses, the already known. With enough time and minimal variation in daily routine you're essentially living on auto-pilot, maximum efficiency and appropriate affect.

As a computer programmer you should be intrigued by how model making could be accomplished out of raw data. I have a BS in Computer Engineering but I've never heard of an application that does that without very specific goals. What's known as Narrow AI is basically just pattern recognition used to accomplish a specific task, right? Maybe that's part of what the brain does but not even close to the whole picture.

Artificial general intelligence (AGI) is the hypothetical ability of an intelligent agent to understand or learn any intellectual task that a human being can. It is a primary goal of some artificial intelligence research and a common topic in science fiction and futures studies. AGI can also be referred to as strong AI, full AI, or general intelligent action. Some academic sources reserve the term "strong AI" for computer programs that can experience sentience, self-awareness and consciousness. Today's AI is speculated to be decades away from AGI.

That article dwells on the various efforts to model the functions of the brain down to the neuron, even to the molecular level, but it doesn't ask what it means to be able to create a model. Not an easy question. I think a clue can be found in the way the brain seems to love to employ metaphors. Language relies especially on them. Everything is like something else. So it seems that when patterns are detected and reinforced through usage they can be copied or shared in order to model completely different circumstances. And a lot of that occurs unconsciously only to become conscious in a eureka moment of inspiration.
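
To make the "Narrow AI is basically just pattern recognition used to accomplish a specific task" point above concrete, here's a toy sketch in Python. Everything in it is invented for illustration, not any real AI library: a single perceptron that learns exactly one pattern (whether a point lies above the line x + y = 1) and can do nothing else.

# Toy illustration of "narrow AI as pattern recognition": a single perceptron
# that learns exactly one task -- is x + y greater than 1? -- and nothing else.
import random

def train_perceptron(samples, epochs=50, lr=0.1):
    """samples: list of ((x, y), label) pairs, with label in {0, 1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = target - pred          # feedback signal
            w[0] += lr * error * x1        # nudge the "synapse weights"
            w[1] += lr * error * x2
            b += lr * error
    return w, b

random.seed(0)
data = []
for _ in range(200):
    x, y = random.random(), random.random()
    data.append(((x, y), 1 if x + y > 1 else 0))

weights, bias = train_perceptron(data)
print(weights, bias)   # the learned boundary approximates x + y = 1

The point of the sketch is the narrowness: the learned weights encode one decision boundary for one task, which is roughly what "pattern recognition used to accomplish a specific task" amounts to, and nothing like a general model of an environment.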
 
The general definition of AI is technology that mimics or emulates facets of humans.

Visual pattern recognition and speech recognition. Robotics.

A recent NPR show had a talk about habits and the brain. Habits are behaviors that become hard-wired, which reduces response time and energy use as well.

A study showed that learning a new task takes more energy than doing the task once learned.
 
Creating models sounds about right. There is an obvious advantage to quick reaction times: more effective hunting, better social skills, etc.

The conscious integrates new information into what will later be automatic responses, the already known. With enough time and minimal variation in daily routine you're essentially living on auto-pilot, maximum efficiency and appropriate affect.

As a computer programmer you should be intrigued by how model making could be accomplished out of raw data. I have a BS in Computer Engineering but I've never heard of an application that does that without very specific goals. What's known as Narrow AI is basically just pattern recognition used to accomplish a specific task, right? Maybe that's part of what the brain does but not even close to the whole picture.

Artificial general intelligence (AGI) is the hypothetical ability of an intelligent agent to understand or learn any intellectual task that a human being can. It is a primary goal of some artificial intelligence research and a common topic in science fiction and futures studies. AGI can also be referred to as strong AI, full AI, or general intelligent action. Some academic sources reserve the term "strong AI" for computer programs that can experience sentience, self-awareness and consciousness. Today's AI is speculated to be decades away from AGI.

That article dwells on the various efforts to model the functions of the brain down to the neuron, even to the molecular level, but it doesn't ask what it means to be able to create a model. Not an easy question. I think a clue can be found in the way the brain seems to love to employ metaphors. Language relies especially on them. Everything is like something else. So it seems that when patterns are detected and reinforced through usage they can be copied or shared in order to model completely different circumstances. And a lot of that occurs unconsciously only to become conscious in a eureka moment of inspiration.

Well, I think when you break it down, most of human behavior boils down to: I have a problem; do I know the solution? For the overwhelming majority of things that people have to do, they already have the solution.

- I need to get into this building so I pull the door open
- I'm hungry so I need to purchase food
- I need money so I need to find a job

In the overwhelming majority of scenarios, the logic that we're actually carrying out tends to be fairly basic. Even in complex construction projects and the like you typically have many different people carrying out minor tasks. What's interesting, though, is how easily people flail and struggle when they actually have to engage their preexisting knowledge to solve a novel problem. Most people are actually very bad at that.

So I think there is a risk in overstating the importance of model building for the behavior needed in the bulk of a human life, and understating how important it is for people to acclimate to the culture they live in. Read: basically just copy and mimic those around them. If you can do what everyone else is doing, and integrate what they tell you already works, most of the time you'll be fine.

But I think model building is essentially the brain creating its own data points. It's on all the time, it's stimulated all the time, and it is by definition a pattern and association maker. So over time it is also able to create its own data points, outside of just copying them. These data points are 'solutions' that can be used to create further points.
 
The AI angle is interesting, though. I wonder if one of the difficulties in replicating human intelligence is that biology implies a raison d'être. That is, human behavior is intrinsically placed within the context of survival and reproduction. Perhaps technical people have mistaken 'intelligence' as the driver of human behavior, where in human life physiological features like hunger, thirst, sexual desire, etc. are actually the core features. Our cognition is ultimately guided by, and a slave to, those features.

So if you're going to build a 'sentient' AI, what are the core drivers of its behavior? One would think there'd need to be some type of scope that limits its behavior. What does it do? Why does it exist?
 
The brain is a multifunctional parallel information processor, producing a wide range of behaviours. Thought is a function of consciousness: the brain's dynamic mental construct of the world and self, evolved as a means of interacting with the world and negotiating through its objects and events. An adaptive mechanism.
 
The brain is a multifunctional parallel information processor, producing a wide range of behaviours. Thought is a function of consciousness: the brain's dynamic mental construct of the world and self, evolved as a means of interacting with the world and negotiating through its objects and events. An adaptive mechanism.

Right, but the question is: what is the function of thought? That thought helps us to negotiate the world is implied in the OP, but the question is how it accomplishes this, what it is actually doing. A cook cooks to produce a meal; what I'm interested in is what happens during the process.

In your view did I pretty much sum it up in the OP, or am I missing anything?
 
It is interesting to note that modern AIs like DeepMind's Alpha emulate organic neural nets more closely than one might expect.

'Nodes' resemble neurons, with many dendrites providing inputs and a single output (axon) which delivers a result to several or many nodes in the next layer. Learning signals feed back, when feedback is available, to increase or reduce synapse weights. The organic brain has a similar mechanism (with feedback passing through the hippocampus?).

Today's best neural networks are similar to the Back-prop software networks developed circa 1990 with three major enhancements:
* (1) Much deeper and broader networks are in use, due to lowering hardware costs.
* (2) Subnetworks are provided to support shifting and other symmetries.
* (3) A different, better, transfer function is used. 1990's Back-prop used a "logistic" (sigmoid) transfer function, with the response to low and high activation asymptotic to constant minimum and maximum levels, respectively. Modern systems like Alpha have a constant minimum, but output continues to increase for high activations.

Points (1) and (2) clearly move the engineered electronic brain toward better emulation of the organic brain, but so does (3): cerebral neurons typically have a minimum non-zero firing rate (and zero is a hard lower limit anyway), but can repeatedly fire quickly under high activation.
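
A minimal numerical sketch of point (3), assuming the "modern" transfer function being described is a ReLU-style rectifier (the post doesn't name it): the 1990s logistic function saturates at both ends, while the rectifier is floored at zero but keeps growing with activation, loosely like a neuron's firing rate.

# Logistic (sigmoid) vs. a ReLU-style rectifier: the sigmoid flattens out at
# both extremes, while the rectifier has a hard floor at zero but keeps
# increasing as activation rises.
import math

def logistic(a):
    return 1.0 / (1.0 + math.exp(-a))

def relu(a):
    return max(0.0, a)

for a in (-4.0, -1.0, 0.0, 1.0, 4.0, 10.0):
    print(f"activation={a:>5}  logistic={logistic(a):.3f}  relu={relu(a):.1f}")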
 
The brain is a multifunctional parallel information processor, producing a wide range of behaviours. Thought is a function of consciousness: the brain's dynamic mental construct of the world and self, evolved as a means of interacting with the world and negotiating through its objects and events. An adaptive mechanism.

Right, but the question is: what is the function of thought? That thought helps us to negotiate the world is implied in the OP, but the question is how it accomplishes this, what it is actually doing. A cook cooks to produce a meal; what I'm interested in is what happens during the process.

In your view did I pretty much sum it up in the OP, or am I missing anything?

Thought is part of conscious report: the process of making sense of something, learning how to deal with problems as they present themselves. Many tasks require little or no thought. We get dressed, make breakfast, drive, etc., without the need to think too much about it. A driver cutting into our path may provoke a line of thought; an unfamiliar problem requires a lot of thought.
 
...
That article dwells on the various efforts to model the functions of the brain down to the neuron, even to the molecular level, but it doesn't ask what it means to be able to create a model. Not an easy question. I think a clue can be found in the way the brain seems to love to employ metaphors. Language relies especially on them. Everything is like something else. So it seems that when patterns are detected and reinforced through usage they can be copied or shared in order to model completely different circumstances. And a lot of that occurs unconsciously only to become conscious in a eureka moment of inspiration.

Well, I think when you break it down, most of human behavior boils down to: I have a problem; do I know the solution? For the overwhelming majority of things that people have to do, they already have the solution.

- I need to get into this building so I pull the door open
- I'm hungry so I need to purchase food
- I need money so I need to find a job

In the overwhelming majority of scenarios, the logic that we're actually carrying out tends to be fairly basic. Even in complex construction projects and the like you typically have many different people carrying out minor tasks. What's interesting, though, is how easily people flail and struggle when they actually have to engage their preexisting knowledge to solve a novel problem. Most people are actually very bad at that.

Perhaps people don't trust in their unconscious thoughts enough. They aren't taught to rely on inspiration (hence, the importance of teaching the arts in school). People don't enjoy creative problem solving and would rather rely on following established paths.

So I think there is a risk in overstating the importance of model building for the behavior needed in the bulk of a human life, and understating how important it is for people to acclimate to the culture they live in. Read: basically just copy and mimic those around them. If you can do what everyone else is doing, and integrate what they tell you already works, most of the time you'll be fine.

(Hence religion.) You are of course right about the basics. I'd take it even further and suggest that our very sense of self is based on our ability to identify with and replicate those we see around us at an early age. Even as an adult I see a tendency within myself to take on the characteristics of other people, whether they are acquaintances or just characters in a movie. But I still think that the basic mechanism requires an ability to create models. A child sees its parents and tries to make sense of their actions. And it eventually begins to recognize similarities in its own actions and desires and begins to build a model of the self. From then on the self becomes the reference point to which every other model becomes oriented, transforming simple awareness into conscious awareness.

But I think model building is essentially the brain creating its own data points. It's on all the time, it's stimulated all the time, and it is by definition a pattern and association maker. So over time it is also able to create its own data points, outside of just copying them. These data points are 'solutions' that can be used to create further points.

That sounds interesting but I'm not sure how it could be done. As I said, it would be very good to understand how the brain creates models. When the models are sufficiently reinforced they can interact autonomously. When they deviate from the expected then the Self becomes involved. So I guess you could say it creates new data points.
 
The AI angle is interesting, though. I wonder if one of the difficulties in replicating human intelligence is that biology implies a raison d'être. That is, human behavior is intrinsically placed within the context of survival and reproduction. Perhaps technical people have mistaken 'intelligence' as the driver of human behavior, where in human life physiological features like hunger, thirst, sexual desire, etc. are actually the core features. Our cognition is ultimately guided by, and a slave to, those features.

So if you're going to build a 'sentient' AI, what are the core drivers of its behavior? One would think there'd need to be some type of scope that limits its behavior. What does it do? Why does it exist?

I might have already mentioned that I think the raison d'être of the brain is to coordinate stimulus-response interactions in order to increase efficiency. Of course there are genetically programmed exigencies that arise in order to conform to any environment. But the brain provides a more useful way for an organism to adapt to more quickly changing circumstances. The same way that cultural changes provide survival advantages over genetic selection because they can occur much more rapidly. But I wouldn't say one is necessarily the slave of the other. Intelligence is also the product of the brain's drive to minimize conflict. The advantage of the intellectual process is that it incorporates random possibilities, similar to evolution. As I've said before, the brain is like an ecosystem of competing concepts. The core driver would be the support cells that provide the resources that activate or suppress one process or another. That state of activation is what we call feelings.
 
Well, I think when you break it down, most of human behavior boils down to: I have a problem; do I know the solution? For the overwhelming majority of things that people have to do, they already have the solution.

- I need to get into this building so I pull the door open
- I'm hungry so I need to purchase food
- I need money so I need to find a job....

For an animal, the world demands movement to survive. Plants survive in place. Animals survive by moving.

There are biological needs. Food, water, shelter.

There are psychological desires that come with being a social animal. Companionship, stable mating partnerships.

This forces the animal to make many decisions.

And that is what the mind (not the brain) is great at. The mind decides how it will act. The brain has no idea any decision has been made. The brain does not think. It is what allows thoughts to occur. The mind is aware of thoughts. Not the brain.

And people don't decide to get a job and work. They are forced to do it.
 
The AI angle is interesting, though. I wonder if one of the difficulties in replicating human intelligence is that biology implies a raison d'être. That is, human behavior is intrinsically placed within the context of survival and reproduction. Perhaps technical people have mistaken 'intelligence' as the driver of human behavior, where in human life physiological features like hunger, thirst, sexual desire, etc. are actually the core features. Our cognition is ultimately guided by, and a slave to, those features.

So if you're going to build a 'sentient' AI, what are the core drivers of its behavior? One would think there'd need to be some type of scope that limits its behavior. What does it do? Why does it exist?

I might have already mentioned that I think the raison d'être of the brain is to coordinate stimulus-response interactions in order to increase efficiency. Of course there are genetically programmed exigencies that arise in order to conform to any environment. But the brain provides a more useful way for an organism to adapt to more quickly changing circumstances. The same way that cultural changes provide survival advantages over genetic selection because they can occur much more rapidly. But I wouldn't say one is necessarily the slave of the other. Intelligence is also the product of the brain's drive to minimize conflict. The advantage of the intellectual process is that it incorporates random possibilities, similar to evolution. As I've said before, the brain is like an ecosystem of competing concepts. The core driver would be the support cells that provide the resources that activate or suppress one process or another. That state of activation is what we call feelings.

I'm not sure if you fully understood. Human needs like reproduction, hunger, social stimulation, thirst, and exercise are ever-present and constantly need to be satisfied, from birth until death. I don't think intelligence is literally a slave to these features (that was just a conceptual metaphor); what I meant is that these human needs dictate what our cognition is doing, most of the time. IOW, cognition doesn't exist in a vacuum; it isn't free from trying to fulfill biological needs.

So not only is it usually (maybe always) oriented to some type of human need, but the models it creates are also going to be fundamentally linked to those needs.

In contrast, AI (which I'm very uninformed about) doesn't seem to have any type of intrinsic constraint. That sounds like it would be a good thing, but maybe if you were creating an intelligent device, such a scope and sensory system would need to be present to limit and direct its behaviour. What needs is it trying to satisfy with its intelligence?
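
For what it's worth, here is a hypothetical toy sketch of what that kind of intrinsic constraint could look like in code. All the names and numbers are made up for illustration, not based on any real AI framework: an agent loop whose behaviour at every step is scoped by whichever internal "need" is currently most depleted, rather than by open-ended intelligence.

# Hypothetical toy agent whose behaviour is limited and directed by internal
# drives: at each step it acts to restore whichever need is most depleted.
# Purely illustrative -- names and values are invented.

drives = {"energy": 0.8, "hydration": 0.9, "social": 0.5}   # 1.0 = satisfied
decay = {"energy": 0.05, "hydration": 0.08, "social": 0.02}
actions = {"energy": "seek food", "hydration": "seek water", "social": "seek company"}

def step(drives):
    # Needs decay over time; that decay is what gives behaviour a direction.
    for name in drives:
        drives[name] = max(0.0, drives[name] - decay[name])
    # Attend to the most depleted need; "intelligence" serves the drive.
    urgent = min(drives, key=drives.get)
    drives[urgent] = min(1.0, drives[urgent] + 0.3)          # acting restores it
    return urgent, actions[urgent]

for t in range(5):
    need, action = step(drives)
    print(f"t={t}: most urgent need = {need}, action = {action}")

The design point is only that the drives, not the reasoning, supply the "why": remove them and the loop has nothing to do.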
 
A complex environment requires a certain degree of intelligence to understand and negotiate: recognition, categorization, food or not food, safe shelter, danger, etc. Instinct as a form of memory only goes so far. The ability to learn and to understand the world at large increases the chance of success. To not only survive but to thrive.
 