• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

Cognitive dissonance and changing beliefs

Brian63
Veteran Member · Joined Jan 8, 2001 · Messages: 1,639 · Location: Michigan · Gender: Male · Basic Beliefs: Freethinker/atheist/humanist
I find the following two articles helpful for understanding cognitive dissonance and skepticism:


What Is Cognitive Dissonance?


Why Bad Beliefs Don’t Die


The second article more explicitly distinguishes between senses and beliefs (the two have no need to cohere with each other, and indeed there can be a greater survival benefit when they do not even try to conform to each other). Whenever we are confronted with a scenario where our beliefs are in opposition to sensory data, the brain has no problem prioritizing: it will go with whatever keeps the beliefs alive. As the article itself states:

As far as our brain is concerned, there is absolutely no need for data and belief to agree. They have each evolved to augment and supplement one another by contacting different sections of the world. They are designed to be able to disagree...The brain doesn't care whether or not the belief matches the data. It cares whether or not the belief is helpful for survival. Period.



The first article, though, seems to carry a different sentiment, stating:

Psychologist Leon Festinger first proposed a theory of cognitive dissonance centered on how people try to reach internal consistency.

He suggested that people have an inner need to ensure that their beliefs and behaviors are consistent. Inconsistent or conflicting beliefs lead to disharmony, which people strive to avoid.
(emphasis added)

I am unclear whether those two statements cohere with each other in a way that I am not seeing, or whether they contradict. Our brains do not care whether beliefs and data contradict each other; however, the brain does care when two beliefs contradict each other. Is that it?

Thanks.
 

The term "cognitive dissonance" is often misused and applied too broadly to refer to any sort of internal conflict. Festinger coined the term to refer specifically to when one's beliefs conflict with one's behavioral actions, such as when one believes that smoking is bad and yet one smokes. That is not the same as when one's beliefs conflict with sensory data. The conflict between one's own beliefs and actions is unique because both are defining aspects of one's sense of self, and unless one has two selves (which most people would find aversive), the defining aspects of oneself should cohere with each other.

But sensory data is not a defining feature of oneself, and it only contradicts a belief if the person accepts the logical implications of that data. Thus, the conflict can be easily avoided by simply ignoring the logical implications of the data, or by creating new beliefs that allow the data to be explained in a way that is consistent with the existing beliefs.

However, I'd take issue with the first statement you quoted, in that sensory information is what makes beliefs more accurate, and accurate beliefs often are critical for survival and reproductive success. Thus, there has been some selection pressure on our brains to form data-based/sensory-based beliefs, at least under circumstances where the veracity of the belief directly impacts well-being.
 
Festinger coined the term to refer specifically to when one's beliefs conflict with one's behavioral actions, such as when one believes that smoking is bad and yet one smokes.

Smoking is actually one example that the author of the original article uses as well (actually, referring back to Festinger's use of it as an example). The article's author, Kendra Cherry, says:


"According to Festinger, a person might decide that they value smoking more than health, deeming the behavior "worth it" in terms of risks versus rewards.

Another way to deal with this dissonance is to minimize the potential drawbacks. The smoker might convince himself that the negative health effects have been overstated. He might also assuage his health concerns by telling himself that he cannot avoid every possible risk out there.

Festinger suggested that the smoker might try to convince himself that if he does stop smoking then he will gain weight, which also presents health risks. By using such explanations, the smoker is able to reduce the dissonance and continue the behavior."



This may be a triviality, but I wanted to mention this:

In this case it is not that the person thinks smoking is overall bad but does it anyway; it is that their brain works to minimize the perceived risks of smoking and maximize the perceived rewards, so that when the final calculations are done they will think it is overall a good behavior to engage in. The same reasoning explains why someone may "forget" to exercise on a certain day or will eat that extra chocolate Swiss Roll. It is not that she knows it is bad and does it anyway; it is that she convinces herself that, at that moment, the short-term sugar rush or the feeling of lying down is more valuable and appreciable than any detrimental effects to her health.

Smoking, abusing alcohol and drugs, not exercising, and so on are not objectively "bad" for a person, in other words. They just satisfy different cravings than the alternative behaviors do. If a person does not care about living a long life but really does enjoy eating more of certain foods (which taste good but are unhealthy), then focusing on the latter is still in their own self-interest.

But sensory data is not a defining feature of oneself, and it only contradicts a belief if the person accepts the logical implications of that data. Thus, the conflict can be easily avoided by simply ignoring the logical implications of the data, or by creating new beliefs that allow the data to be explained in a way that is consistent with the existing beliefs.

Agreed. The author of the other article, "Why Bad Beliefs Don't Die" (Gregory Lester), takes up that theme a bit further too. Too many skeptics get frustrated when they present evidence for their position and the other person just dismisses it in some convoluted way. The skeptic does not realize they are swinging and whiffing every time, wasting their own energy. Skeptics have to understand, and communicate, the implications the new beliefs would carry and how they would shake up the other person's existing worldview.

However, I'd take issue with the first statement you quoted, in that sensory information is what makes beliefs more accurate, and accurate beliefs often are critical for survival and reproductive success. Thus, there has been some selection pressure on our brains to form data-based/sensory-based beliefs, at least under circumstances where the veracity of the belief directly impacts well-being.

That statement of his is surprisingly strong, and I am still trying to assess it. He holds a doctorate in psychology and is an expert in the field and so I give him a lot of credibility right out of the gate. Still, it seems to overstate the case for the reason you mention. I wonder if he did so because he was writing that article in more layperson's terms and so had to take some liberties with his generalizations. On the other hand, he seems rather emphatic about it. I do not know but would be interested in finding out. A few months ago I had tried contacting him but never received a response.
 
However, I'd take issue with the first statement you quoted, in that sensory information is what makes beliefs more accurate, and accurate beliefs often are critical for survival and reproductive success. Thus, there has been some selection pressure on our brains to form data-based/sensory-based beliefs, at least under circumstances where the veracity of the belief directly impacts well-being.

That statement of his is surprisingly strong, and I am still trying to assess it. He holds a doctorate in psychology and is an expert in the field and so I give him a lot of credibility right out of the gate. Still, it seems to overstate the case for the reason you mention. I wonder if he did so because he was writing that article in more layperson's terms and so had to take some liberties with his generalizations. On the other hand, he seems rather emphatic about it. I do not know but would be interested in finding out. A few months ago I had tried contacting him but never received a response.

For the record, I hold a doctorate in Psychology as well. But I don't think that expertise is highly relevant here, because it is really just a matter of combining the principle of natural selection with the assumptions of empiricism on which all science is based, namely that we construct, test, and revise beliefs to be more accurate using sensory observations. The only alternatives are that our beliefs are entirely random and we have no basis to form more or less accurate ones, or that we buy into supernatural woo nonsense, such as divine revelation.

I guess psychology comes into play in recognizing that we come into the world with almost no information and few "ideas", and that the senses are how we gather all the information we are not born with, and thus are the source of all new ideas. Those ideas/beliefs are what drive our behavior. If we form beliefs that safe and nutritious things are dangerous, we starve and don't pass on our genes. If we form beliefs that dangerous things are cuddly or good for us, then we die and don't pass on our genes. Sensory information consists of basic realities of the world physically impacting our biological systems. If we didn't coordinate our beliefs with that sensory information to a large degree, we would have zero chance of having even remotely accurate beliefs about the reality around us, and we'd die on our first day without a caregiver.

Imagine you come across an object, and that object is a hungry lion. Ignoring the information you have gained from past sensory experiences, there are an infinite number of beliefs you could form about what that lion is, including ignoring your current sensory information completely and believing that there is actually no object in front of you at all. There are a limited number of actions that will let you survive, and since actions are driven by beliefs, a limited number of beliefs that will let you survive. If you truly disregarded sensory information in forming beliefs, you would arrive at a belief at random, and your odds of landing on a pro-survival belief would be near zero.
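That odds argument can be made concrete with a toy simulation. All the numbers here (the size of the belief space, the survival probabilities) are invented purely for illustration; this is a sketch of the reasoning, not anything from either article:

```python
import random

random.seed(42)

# Toy model: an agent encounters a lion. Out of a large space of possible
# beliefs about the object, only a handful lead to a survival action.
BELIEF_SPACE = 10_000      # candidate beliefs the brain could form (invented)
SURVIVAL_BELIEFS = 5       # beliefs that happen to prompt fleeing/hiding (invented)
TRIALS = 100_000

def survives(use_senses: bool) -> bool:
    if use_senses:
        # Sensory data ("large, toothy, stalking") narrows the candidates
        # to a few plausible interpretations, most of which prompt flight.
        return random.random() < 0.95
    # Ignoring the senses, the agent picks a belief uniformly at random.
    return random.randrange(BELIEF_SPACE) < SURVIVAL_BELIEFS

for mode in (True, False):
    rate = sum(survives(mode) for _ in range(TRIALS)) / TRIALS
    label = "sensory-based" if mode else "random belief"
    print(f"{label:14s} survival rate: {rate:.4f}")
```

The exact figures don't matter; the point is only that random belief-formation survives at roughly SURVIVAL_BELIEFS / BELIEF_SPACE, which is near zero for any realistically large belief space.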

I suspect that author would agree with all of that; in fact, he states that what matters is that your beliefs help you survive. Well, since most beliefs are about what the immediate world around us is and is doing, and how we get the most out of it without dying, having beliefs that are strongly tied to the immediate world around us is critical to survival. Other than woo magic, how would that happen without beliefs being tethered to sensory information that constantly provides information about the world around us?

I suspect you are right that he's trying to sell a book to the general public, and thus, like most scientists selling books (including Pinker), he overstates his point for effect. I think the more modest and more accurate summary of his point would be that we don't always need our beliefs to be accurate in order for them to help with survival. For example, most people can survive quite fine disbelieving in evolution, because being wrong about it has very little relevance to everyday life. So our brains clearly can process sensory data without a corresponding change in beliefs. But have most people in human history gone around forming ideas about such abstractions, or about everyday things that directly impact how they act and react to the world around them?
 
Thanks for your contributions on this topic, Ron. Threads on the forum often become needlessly heated and antagonistic, but it is nice to sometimes have a friendly and very educational experience too. So I appreciate any thoughts you have on the matter. In general what you say and what Lester says make a lot of sense to me, yet they seem incongruent, and I am very confused on how to reconcile them all and am very interested in sorting it out. If you have not already, I would strongly suggest directly reading Lester's full article (linked in the OP) rather than relying on my excerpts of it or attempts to characterize it.

Well, since most beliefs are about what the immediate world around us is and is doing, and how we get the most out of it without dying, having beliefs that are strongly tied to the immediate world around us is critical to survival.

I think Lester would say that accepting misinformation, or being biased in certain situations, can allow us, at least temporarily, to survive longer. There are actually enough exceptions to the rule (cases where relying on accurate data is more helpful) that the generalization never carries much weight. Or it does on a more macro scale, when thinking in terms of general populations, but on the micro scale each individual often benefits more from holding biased beliefs than from holding accurate ones.

He gives some examples where our senses cannot provide us with reliable information, and where we form helpful precautionary beliefs and take corresponding actions despite what our senses say is the case.

Other than woo magic, how would that happen without beliefs being tethered to sensory information that constantly provides information about the world around us?

That is where I think the transaction costs of changing beliefs come into play. If it were free and easy to change beliefs, there would be a stronger propensity for people to rely on accurate empirical data. Changing beliefs, however, comes with a steep price (for many beliefs): it upsets other beliefs we hold, it decreases our self-confidence (or even our arrogance) about our skill in distinguishing what is true from what is false, and so on.
 
A lot of concrete (as in cement-headed) thinking behind the conclusions above. Why would fitness have anything to do with changing from mode A to mode B, when neither is actually what happens nor relevant to fitness? I think it better to say that social psychologists apparently have the luxury of inventing things as topics for books, which vaccination deniers then use as resources for their prejudices. More on point: remember, dissonance is actually based on capitalizing on the fact that some animals, like cats and rats, are floaters and would rather be up than under.
 
Why would fitness have anything to do with changing from mode A to mode B, when neither is actually what happens nor relevant to fitness?

You think what we believe about the world has no impact on our fitness in it?

The beliefs we hold impact the decisions we make.
The decisions we make impact the actions we take.
The actions we take impact our welfare and fitness in the world.

Which of those three points are you disputing, if any? If you accept all three as premises, then the consequence is that the beliefs we hold impact our welfare and fitness in the world.
 
Cognitive dissonance avoidance - acceptance occurs when for example Christians are so cross conditioned way beyond therapy to even consider any actions such as Islamophobia pedophilia anything other than federal sin if they could even recognize doing that.
 
Emotional appeal overriding logic and reason: whether consciously or unconsciously, dismissing whatever happens to put one's favoured view in question.
 
Why would fitness have anything to do with changing from mode A to mode B, when neither is actually what happens nor relevant to fitness?

You think what we believe about the world has no impact on our fitness in it?

The beliefs we hold impact the decisions we make.
The decisions we make impact the actions we take.
The actions we take impact our welfare and fitness in the world.

Which of those three points are you disputing, if any? If you accept all three as premises, then the consequence is that the beliefs we hold impact our welfare and fitness in the world.

I'm disputing all three 'points'. Your agency is not really related to your fitness. It's chance that determines the genes and conditions in which your genes are expressed. It's not about you. It's about the survival of those who reproduce. Your agency doesn't determine how well you see, react, or feel in whatever situation you find yourself. If agency were important to fitness, most every living thing would possess agency. No philosopher who ever wrote claims that merely being alive matters, and agency wouldn't explain why plants evolve.
 
Emotional appeal overriding logic and reason: whether consciously or unconsciously, dismissing whatever happens to put one's favoured view in question.

Acceptance of beyond the pleasure principle imprinting flawed logic & reason. This national religion of Christiananality Islamophobia Pedophilia can't be based on 'forgive them they know not what they do' is cognitive avoidance in dissonance acceptance; which stimulates suicidal super ego homicidal sociopsychopathilogical human farming of jihad & lynching.
 
It's chance that determines the genes and conditions in which your genes are expressed. It's not about you.

The genes and conditions in which my genes are expressed are indistinguishable from me. They are me. Saying "me" or "I" is just a shorthand way of saying "the genes and environment surrounding those genes." Same goes for every other person and every other living being. I am certainly not claiming there is anything beyond our material or physical genes, such as any spirit or soul.

Going back to the first point again as an example: "Our beliefs we hold impact the decisions we make."

That is just a more lucid way of stating: "The beliefs formulated by our genes and the environment surrounding those genes impact the decisions that the brain makes."

So I think we are in agreement, but are just using different terms and phrases to make the same points. I just said "I" where you say (effectively) that there is no such thing as "I" and there is simply a set of genes and the environment. Well, okay, I just said "I" as an abbreviation for that. It does not change the substance of the point though.
 
The genes and conditions in which my genes are expressed are indistinguishable from me.


We aren't saying anything like the same thing. The 'I' is a self-defined construct. Its relationship to genes and the like exists only in the approximations that we make from our uninformed points of view, which we take as a subset for the 'me' construct.

Fitness has no place for such a construct, since fitness is defined as a stochastic process. I gave you a big clue when I included trees as being life. They have no nervous system, no mechanism for agency, yet they follow the same dictates of fitness as humans. I could reach out and find rationales to interpret treeness as agency without mind, but that would be silly and useless when it's perfectly obvious that what we are talking about is not the result of agency.

Others have often made the same mistake. Horace Barlow famously (and sillily) said that a cat neuron firing when he was in its field meant that cats had Barlow face detectors. He's also the one who announced finding monkey-claw detectors, since neurons fired three times in the presence of a monkey's hand in these same cats. Famous ethologists and behavioral scientists have been just as prone to such thinking. Bad thinking, bad science.

Scientists in different fields set as strict a set of criteria for significance as their ability to measure permits. With the latest finding in particle physics being the rather unexciting detection of the Higgs at six sigma, criteria there will probably be raised in the near future. If we fail to improve criteria as our ability to catalog and measure increases, we'll be permitting elsewhere the sort of greed and fraud that has been happening in medical science because of it.

'Me' is not the totality of genes and experience; it is a rather shabby subset based on the need for agency by large egos as a trait. Put another way, a bigger broom does not make a cleaner floor.
 
Fitness has no place for such a construct, since fitness is defined as a stochastic process. I gave you a big clue when I included trees as being life. They have no nervous system, no mechanism for agency, yet they follow the same dictates of fitness as humans. I could reach out and find rationales to interpret treeness as agency without mind, but that would be silly and useless when it's perfectly obvious that what we are talking about is not the result of agency.

Trees do not possess agency, agreed. Are you saying that because trees still exist despite not having agency, agency is irrelevant to how fit a certain arrangement of genes/environment is? That would be a non sequitur. Some people are born blind, and they live for decades. That does not imply that the ability to process photons is irrelevant to one's likelihood of surviving in the world. Each arrangement of genes/environment carries certain advantages and disadvantages. That is a different statement from saying that genes/environment are irrelevant.

As with having vision versus being blind, perceiving audio versus being deaf, or possessing the ability to smell or feel versus not, the quality of agency (having a brain with consciousness to process all that information) is, for some organisms, more useful than it is costly, while for other organisms it is more burdensome and does not provide sufficient advantages. That is different from arguing the sweeping idea that agency is irrelevant to the fitness of all organisms.
 

Well, since most beliefs are about what the immediate world around us is and is doing, and how we get the most out of it without dying, having beliefs that are strongly tied to the immediate world around us is critical to survival.

I think Lester would say that accepting misinformation, or being biased in certain situations, can allow us, at least temporarily, to survive longer. There are actually enough exceptions to the rule (cases where relying on accurate data is more helpful) that the generalization never carries much weight. Or it does on a more macro scale, when thinking in terms of general populations, but on the micro scale each individual often benefits more from holding biased beliefs than from holding accurate ones.

Eh. Think about all your own beliefs. Remember that virtually everything you think is a belief, and everything you do is based on countless beliefs. The mere act of driving your car entails hundreds of beliefs about the world: how each part of your car works, how other drivers and pedestrians are likely to behave, what the words on signs mean, what a red light means, and so on. If all those beliefs were not largely accurate (and they only are to the extent you formed them based upon sensory experiences), then you'd be dead before you got a mile down the road.

Sure, we do skew some beliefs to our emotional advantage or default to beliefs that are a safer bet even if wrong (e.g., it's safer to assume an unknown entity is a threat). But in order to act and react to these beliefs in a way that is useful, we have to have basic knowledge of the world around us, and that "knowledge" is itself just sensory-based belief. Psychologically, there is no difference between "knowledge" and "belief". There are just beliefs that are more and less accurate, according to whether they are logically coherent with sensory information, which makes what we refer to as "knowledge" a subset of belief.
 
The mere act of driving your car entails hundreds of beliefs about the world: how each part of your car works, how other drivers and pedestrians are likely to behave, what the words on signs mean, what a red light means, and so on. If all those beliefs were not largely accurate (and they only are to the extent you formed them based upon sensory experiences), then you'd be dead before you got a mile down the road.

Agreed. Those are examples where our brains try to avoid being wrong, because the consequences of being wrong would be very detrimental. In plenty of other cases, being wrong simply does not harm us; very often it actually benefits us. This short three-minute video (Why facts don't convince people) goes over some examples, especially where beliefs intersect with our tribalistic views on politics and religion. We will go with whatever our biased tribe says is right rather than do any investigating of our own. If we start to suspect that our own tribe has actually been wrong this whole time, that puts us under stress and anxiety.

Sure, we do skew some beliefs to our emotional advantage or default to beliefs that are a safer bet even if wrong (e.g., it's safer to assume an unknown entity is a threat). But in order to act and react to these beliefs in a way that is useful, we have to have basic knowledge of the world around us, and that "knowledge" is itself just sensory-based belief.

I would make an important change to the end of that last statement. It is not "just sensory-based belief" but rather "heavily biased sensory-based belief." We will dismiss, ignore, or minimize inconvenient sensory data up to a threshold where the harm in doing so exceeds the benefits. Using the earlier example, we gain no real benefit from thinking we have a green light when actually it is red: little (if any) reward, but extremely high risk.

Plenty of other beliefs skew in the opposite direction, and there are abundant rewards (and few costs) for being closed-minded toward conflicting sensory input. Those with liberal political views will avoid watching Fox News; conservatives will gravitate toward it. Both groups base their beliefs on sensory data, same as in the stoplight example, but in this case they choose what they consider to be (accurate) sensory data. We do not get to choose whether the stoplight is red, yellow, or green; that sensory data is imposed on us. (We have no strong incentive to think it is something else anyway, as we do for other beliefs.)
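The threshold idea can be phrased as a simple expected-value comparison. The numbers below are invented purely for illustration (neither article gives any figures); the sketch only shows why the same cost-benefit rule yields opposite answers for the red light and for a tribal belief:

```python
# Toy expected-value model: when does the "convenient" belief pay off?
# All numbers are invented for illustration.

def expected_value(p_wrong: float, cost_if_wrong: float,
                   comfort_benefit: float) -> float:
    """Net payoff of holding the convenient (possibly wrong) belief."""
    return comfort_benefit - p_wrong * cost_if_wrong

# Red light: being wrong is catastrophic, comfort benefit is negligible.
red_light = expected_value(p_wrong=0.5, cost_if_wrong=1_000_000.0,
                           comfort_benefit=1.0)       # -499999.0

# Tribal political belief: being wrong costs almost nothing day to day,
# while agreeing with one's tribe pays a steady social dividend.
tribal = expected_value(p_wrong=0.5, cost_if_wrong=0.1,
                        comfort_benefit=10.0)          # 9.95

print(f"convenient belief about the light: {red_light:,.2f}")  # hugely negative
print(f"convenient tribal belief:          {tribal:,.2f}")     # positive
```

On this toy accounting, the brain "accepts" the imposed stoplight data because the convenient alternative has a catastrophically negative expected value, while the tribal belief clears the threshold comfortably.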

Overall I think we see it the same way, but you mention that we skew "some" beliefs, and I would modify that to read that "many or most" of our beliefs are biased (just to different degrees). I do not know how either of us could objectively quantify it, though; it seems a subjectively different guesstimate.
 
Overall I think we see it the same way, but you mention that we skew "some" beliefs, and I would modify that to read that "many or most" of our beliefs are biased (just to different degrees). I do not know how either of us could objectively quantify it, though; it seems a subjectively different guesstimate.

I'd argue that the ratio of situations/ideas where it is more useful to be accurate versus more useful to be inaccurate is at least 10:1, and that is being generous. Religious and political beliefs comprise a very tiny fraction of people's beliefs. Again, just walking across the room requires acting upon hundreds of beliefs. And even when a belief is biased, it is typically only biased in some particular way, with 90% of the contents of the belief largely accurate in their representation of sensory input.
So a belief-forming brain system would still be under huge selection pressure to use sensory information as the primary basis for the concepts and ideas (aka beliefs) that the organism forms about its environment.
 
I do not know if you have read Lester's article (Why Bad Beliefs Don't Die) or are working off my quotes of it, but Lester uses a similar analogy on page 2:

As I sit in my living room I cannot see my car. Although I parked it in my driveway some time ago, using only immediate sensory data I do not know if it is still there. As a result, at this moment sensory data is of very little use to me regarding my car. In order to find my car with any degree of efficiency my brain must ignore the current sensory data (which, if relied on in a strictly literal sense, not only fails to help me in locating my car but actually indicates that it no longer exists) and turn instead to its internal map of the location of the car.

I think this is a place where you and Lester would part ways as well, regarding the numbers. Most of the contents of the beliefs we hold are not based on immediate sensory data. When we walk across the room we rely on senses to form beliefs as you mention, and doing so is easy while doing otherwise would be perilous.

A person who wants to confirm they still have a car in the driveway has to get up out of their chair, though, which requires expending a little energy for almost no reward (assuming they had no reason to suspect their car was stolen). A person at night might want to confirm that the sun still exists even though they cannot see it. They could travel across the globe to do it, or just ignore their immediate (lack of) sensory data and instead rely on their memory of how the Earth rotates to make that determination. The latter gives about the same reward, but at much less expense.

So it is far more than just religious and political beliefs where we deny our sensory input when forming beliefs. We do it throughout our daily lives as well. I goofed earlier by limiting my examples to religion and politics.
 
I do not know if you have read Lester's article (Why Bad Beliefs Don't Die) or are working off my quotes of it, but Lester uses a similar analogy on page 2:

As I sit in my living room I cannot see my car. Although I parked it in my driveway some time ago, using only immediate sensory data I do not know if it is still there. As a result, at this moment sensory data is of very little use to me regarding my car. In order to find my car with any degree of efficiency my brain must ignore the current sensory data (which, if relied on in a strictly literal sense, not only fails to help me in locating my car but actually indicates that it no longer exists) and turn instead to its internal map of the location of the car.

But that internal map is based upon sensory data, and it is only useful in helping us find our car if it is objectively accurate in its representation of that sensory data. In fact, almost every aspect of our memories is based upon sensory data. Of course, we cannot rely solely upon immediate sensory data; beliefs are inherently not just immediate sensory data, but a way to summarize the results of past sensory data. Sometimes we use that past sensory data in the form of sensory-driven beliefs instead of immediate sensory data (as with the finding-our-car example), and sometimes we use it in combination with immediate sensory data and/or to make sense of immediate data (such as when we see a lion and connect it to our belief that lions are dangerous, based upon past sensory experiences of creatures that looked similar and that we saw rip the throat out of a zebra).

In fact, knowing when and how to use immediate vs. past sensory data (in the form of memories/beliefs) is itself often guided by having accurate sense-based memories of our own actions. For example, you remember having to drive around a bit to find parking last night and then walking a couple blocks from your car to home. That sense-based memory cues you not to rely on just looking out your window to see where the car is, but rather to try to retrieve the memory of where you parked; if that fails, you try to reconstruct where you parked based on other sense-memories (e.g., how far did I walk? what did I pass on the walk home?); and if that fails, you use sense-based memories of where you have parked in the past when there was no parking out front. IOW, his example is actually a glaring example of relying upon sense-based experiences and beliefs to achieve a goal, which we do every second of the day, whether to make sense of immediate sensory information or in place of that information when it is of little use.
 
This is the last thing I'll say on the matter, and it relates more broadly to the thesis you and Lester are advancing.

The idea that wrong beliefs don't die/change in the face of sensory info because our brains aren't evolved to form sensory-based beliefs has it all backwards. It is precisely because we evolved to form accurate sensory-based beliefs that our beliefs are generally resistant to change. If a wrong belief that ignores sensory information manages to be formed and to survive despite natural psychological pressures against it, then it must be of the relatively uncommon sort that serves a useful function despite being wrong.
And if a belief serves a function useful enough to survive the pressures against wrong, non-sensory beliefs, then it is going to be highly resistant to change in the face of new sensory information. But these are the exceptions, not the rule.

Also, wrong beliefs resist change because all of our beliefs resist change, and that is precisely because the vast majority of our beliefs are sense-based, largely accurate beliefs, and it is adaptive for accurate beliefs to be resistant to change so long as they are predictive and useful.

Sensory-based beliefs can be wrong, due to non-representative samples of experience. Most of the time, however, our sample of experience is relatively random, so it is representative. If most members of a species are X, then your first experience with that species will likely be X, so forming the belief that things that look like it are X will be correct most of the time. As you accumulate more experiences that reinforce that belief, and more reactions based on it that are useful because it is correct, the belief becomes more and more resistant to change, which is adaptive.

If someone comes along and shows you that your sample was biased and that a better sample shows the species is not X, you will resist changing your belief. That isn't because you don't form sense-based beliefs generally, but because you do: you hold mostly sense-based beliefs that are useful mostly because they are accurate, so you should generally resist changing them unless there is a compelling reason to do so.
 