• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

The gap between our beliefs about democracy and objective democracy

Likewise, put them in identically negative environments and they will both likely be harmed, but almost never be equally harmed.

No negative experiences are identical between any two people.

That is just it. An "experience" is a subjective mental state. Experiences do not have a reliable relationship to objective environmental factors; they are only partly shaped by the objective features of, and stimuli within, that environment. 100% of the environment's impact on a person is mediated and moderated by how that person's body and brain deal with all the forces and information in that environment.
Two people in the exact same environment are having notably different "experiences", much like a balloon and a bowling ball sitting next to each other on a table do not react the same way to the same breeze that blows through the room when you open a door.
Since no two people (even identical twins) are born with the same body or brain (or genes that heavily impact lifelong development of both), no environment will ever impact two people the same way, even if the two people pass through an identical environment.

Humans are exposed to incredible amounts of "data" as they grow. The experiences, and how the psyche deals with them, move one mind in one direction and another mind in another.

"Experiences" are not the environment, they are how the mind reacts to an environment, and that reaction is heavily influenced by biological differences that impact the reaction at every level. As you point out, we are exposed to incredible amount of data. In fact, even a single momentary environment we pass through has near infinite info that could be processed, but people only process a tiny fraction of it. That means people will not only differ in what and how much they process from a given environment, but will often overlap in only a tiny portion of what they process while sitting right next to each other in a tiny room. Then, even any overlap in what aspects they both process are processed very differently, with biologically based temperaments causing differences in emotional response to the same stimuli, and biologically based differences in things like neural transmission speed and working memory capacity determining what else that stimuli gets associated with in either that current environment or stored memories. Then, biological differences impact how often that experience gets recalled, and how any recall of it alters the way the info in the current environment is processed and how much the memory of that experienced gets modified by the recollection process (is recalled or new environments trigger aspects of it, those same biologically based differences in the information processing system cause even more differences in the long term cognitive impact of the initial experience.

People grow an individual personality over time based on experience and they also grow cognitive capacities the same way.

Correct, and "experiences" are the outputs of body and mind, with countless innate differences moderating what from the environment gets input and how it is processed once input. Change and growth in aspects of personality and cognition are built upon what already exists, interacting with and altering the influence of new environments. IOW, the impact of biological differences doesn't just determine the starting point; it modifies the impact that the environment has for the entire lifespan.
 
I think the above misses the larger point that our common nature, and maybe even environmental circumstances, usually make us very bad at governance and long-term planning. That we're just not cut out for effectively reining in the chaos that exists around the world. So regardless of how you'd define our reasoning skill, it's an objective fact that, averaged out, we're shitty at making the most important choices.

And so societies that work perfectly, and efficiently, are an ideal, but very, very hard to accomplish in practice.

I would in fact characterise human beings as optimally rational. That's on average. 'Optimally' here means that given the constraints on the size of our brain and the demands of the ecological niche that Homo sapiens occupied something like 300,000 years ago, our reasoning is most often as good as it could be.

The problem you've identified, that we're bad at long-term planning, is real.

Long-term planning has become a necessity essentially because we've become each other's worst predator, and we're broadly similar in ability to one another, so the ability to plan farther ahead has become a selective advantage, not between individuals but between different types of social organisation. Our social relations have evolved to reflect this by growing in complexity. Countries that had grown more complex were also those able to plan farther ahead, and they won the wars. The problems facing humanity are essentially man-made, a by-product of the complexification of our societies. Progress means more machines; more machines mean more people; more people mean more machines; and more machines and people mean more pollution and more global warming.

However, it is nothing new in the history of our species that planning far ahead would be an advantage. Ideally, a species would need to plan ahead to take into account the long-term evolution of its environment but to do so it would need a much bigger brain, which evolution didn't provide us with.

The difference today is that we've been able to put in place organisations whose main job is to plan ahead. So, we sort of discovered long-term planning in the process and this got us thinking about the fact that it would be good if we could do it for ourselves personally. But we're stuck because our brain is the size of our brain, and planning farther ahead would be much more costly in terms of time and energy.

Planning is essentially the consideration of what could possibly happen within a predefined period of time, given what the situation is in the present. This is subject to combinatorial explosion: the longer the period of time you want to explore, the more possibilities there are, and it gets out of hand very, very, very quickly. The strategy used by the brain is to rely on a very simple logic, first described comprehensively by Aristotle. We do all our planning on the back of this one horse. Its quality is that it is sturdy, cheap, and versatile. We can do everything we need with this one horse. Its flaw is that it is not effective at sorting out long-term possibilities, because there are too many of them. But it's just our nature, and there won't be any better brain available any time soon.
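The combinatorial explosion described here can be made concrete with a toy calculation (my own sketch, not from the post; the branching factor and horizon values are arbitrary): if an agent can choose among b actions at every step, the number of distinct plans of length d is b**d, which grows exponentially with the planning horizon.

```python
# Toy illustration of combinatorial explosion in planning:
# with `branching_factor` possible actions per step, the number of
# distinct action sequences over `horizon` steps is b**d.

def possible_plans(branching_factor: int, horizon: int) -> int:
    """Count distinct action sequences of length `horizon`."""
    return branching_factor ** horizon

# Even a modest branching factor blows up fast as the horizon grows:
for d in (1, 5, 10, 20):
    print(d, possible_plans(4, d))  # 4, 1024, 1048576, 1099511627776
```

Doubling the horizon squares the number of possibilities, which is why exhaustive long-term planning is out of reach for any fixed-size brain or machine.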

So we're doing the best we can within the means we have. Optimal. We're as rational as could be given the circumstances.



Also, we comprehensively beat all our competitor species. What we cannot be expected to achieve, though, is to beat each other without anyone getting hurt.
EB
 

Good post!
Coloring in some of the spaces... I think we're collectively optimally rational in the sense that you state. On average, individuals may be rational, but they become much less so when gathered into tribes that have to out-compete each other. Long-term planning has allowed some collectives to out-perform (kill and/or displace) others, but the evolutionary effect of rational collective consciousness has not yet been seen, because the inter-generational time of collectives is much greater than that of individuals. It is kind of a miracle that long-term planning is even possible for us, and it is certainly far from optimized, either individually or collectively. A classic example is global warming. We know it's there and we know what needs to be done about it, both collectively and individually, but we still consume huge amounts of fossil fuels, both collectively and individually (on average). Significant action won't be taken until collectives, or the global gestalt, evolve to the point where that action is required to preserve a collective or a collection of collectives.
So yeah - we're great at getting on top of the species heap, but not so good I fear, at staying there.
 
The capacity for rationality is innate, but also variable due partly to innate factors, just like virtually all psychological capacities and tendencies. All people can be taught and trained to be better at rationality and to place more value on trying to apply those rational skills. However, some people are born being better at it and better at getting better at it via education. Education actually tends to enhance innate differences between individuals rather than make them more similar. A standard saying in research on basic cognitive skills is "rich get richer", meaning that those who start out with more of the skill also tend to improve that skill via training at a faster rate. Obviously individuals also differ in how much training they get, so a person with more innate skill can wind up with less skill than someone who got much more training to develop their skill.

We use the term ‘rational’, to mean three different things.
First, to speak of a species-wide characteristic of Homo sapiens: Man is a rational being. This use refers to the ability of normal individual human beings to express their views in a rational way.
Then we can also use the term ‘rational’ to characterise a particular expression of somebody’s views. Somebody’s explanations can be rational or not. An argument will be rational or not.
Thirdly, we can use ‘rational’, or in fact more often ‘irrational’, essentially as a criticism of other people, normally for expressing their view(s) in a way we deem irrational.

Thus, each normal member of the human species can be both rational as to their ability and irrational as to a particular expression of their views, just on one occasion or regularly.

In the first sense, rationality is a general characteristic of all human beings. Only a small percentage of the human population don’t have any ability to express their views in a rational way.

In the third sense, expressing oneself in a rational manner is best understood as either a conscious choice or the result of an unconscious process. This will account for the vast majority of views being deemed irrational, although often somewhat unfairly.


But all of that variability in how much each person is capable of rational thought isn't likely to explain much of the variability in when and who actually engages in rational thought on a topic, or why people given the same info disagree on political issues. Most of that variance is due not to rational skill but to rational will: the choice to apply one's reasoning skills to the issue rather than just rationalize and defend whatever claim serves one's political objectives. The other major source of variance is differences in basic values that determine one's political objectives.

The main variability is due to either a conscious choice or the result of an unconscious process (third sense).
If it is a conscious choice, there’s no good reason to assume ipso facto it must be irrational.
As I see it, using an irrational argument is similar to resorting to a lie. A lie doesn’t make the liar irrational. In fact, lying may be the more rational thing to do. An attitude that the liar could conceivably explain in a rational way (I lied because my life was in danger).
If it is unconscious, and it may be more often largely unconscious, it may work like language itself. We’re usually not entirely conscious of why we say what we say, even when we deliver a top-notch rational explanation. Rather, it comes out as such. We would typically be conscious of some aspects of why we say what we say, not of all aspects. And then, there would be no difference in this respect between rational and irrational expression.
Thus, opting, somehow, for an irrational expression may be the optimal thing to do and it could be your unconscious supervising your apparent choice.
Experiments where people contradict their own knowledge can then be understood as driven by the interaction between subject and observer. The subject may have a grudge, or some negative bias, against people like the observer, and consciously or unconsciously select an expression that will be seen as irrational by the observer, although it may well be more like a lie, but a perfectly rational lie. If so, the observer shouldn’t expect to be able to elicit from the subject a rational account of why they elected to give an irrational expression. Instead, the rational thing for the subject to do will be to keep up the pretence, just like a liar won’t necessarily admit to having lied to you just because you are asking.

Rationality is normally associated both with logic and with verbal expression. Even rational thoughts are supposed to be linguistically framed. Yet, this may be a bias of our linguistic mind and of language as essentially a system of interpersonal and social communication. Our non-verbal rational mind never had a say and its very existence got lost in our communication process.
So, I see the ‘scientific’ view on rationality as affected by communication, cultural and observational biases and unable to account for the facts.


Take human-influenced climate change for example. What is the cause of differences in opinion about whether it's happening?
As with evolution, the facts supporting that it is happening are so clear and widespread that it is implausible that any denier reached that position honestly by applying reasoning to the best of their ability. So denial is almost always the result of concerted efforts to violate honest reasoning in order to reach a preferred conclusion. Theological and economic biases are the source of that bias to deny climate change.

But not all people who accept climate change arrived at that conclusion rationally either. Some did, but others hold the scientific position only because they have an ideological bias toward wanting to believe it. Proposed solutions to the problem of climate change entail restricting most of the ways that the rich have gotten rich, and people limiting their consumption, including not eating animals, etc. Thus, people who have other emotional or ideological reasons to favor such changes have a biased reason to believe in climate change, and would likely believe it even if the science didn't support it.

Of those who accept climate change purely for rational, scientific reasons, some may actually share the same values as either the irrational deniers or irrational accepters, but they value being rational even more, so they accept it even though it creates an obstacle for some of their other political objectives. In sum, applying rational thought is most often the result of an interaction between what one's goals are, how much one values rationality in principle, and which conclusion happens to be supported by rational thought.

I see climate-deniers, for example, as possibly rational people electing, somehow, to defend their view in a rational manner. So, how to explain the difference with the climatologists’ near unanimous view?

First, as we have already agreed, many people simply won’t have the information. They will argue rationally from the wrong premises.

Second, some people just do it in the same way that they would tell lies. And it may well be, most of the time, the rational thing to do. In particular, if you have vested interests in the oil or coal industry for example, you may elect to deny that climate change is caused by human beings and that would be the rational thing to do. For some people, it will be more like an actual lie, if they are conscious of why they deny climate change. Other people will do it without really being aware of what they are doing, but still that will be the optimal thing to do, given their personal knowledge and options in life, like short term personal benefit against caring for future generations.
EB
 
I find that humans are irrational about differences among humans. Obviously we're irrational about ourselves, as we consider ourselves prey (we fear other humans) whenever other humans are also on the stage. Our rationality, or capability for rationality, doesn't even extend to our kin. So we have levels of trust: self, then kin, then clan, then tribe, then etc. So we have trouble even making just laws, systems, workplaces, etc.

In fact, I propose we are irrational. We have the primary capacity to be biased, prejudiced, etc. In fact, our highest value is self. So we are selfish people who parade some level of rationality to get along as a matter of survival. Even survival falls to irrationality: when we are sufficiently threatened we commit suicide or, oppositely, genocide. All things human, I believe, turn on self-interest precluding any real capacity for rationality. Science, gone in a millisecond should tribal stress dominate.
 

I recall hearing an argument on philosophy stack exchange that I liked. I forget the thinker, but someone out there proposed that we've largely developed to interpret objects in the world in terms of their survival value in reference to us, rather than any type of objective processing.

Will it hurt me?
Could we produce babies?
Can they help me in any way?
Will it affect my social status?
etc.

I've found this paradigm useful in understanding many behaviors. To expect people to exceed this boundary is almost always expecting too much; I've known very few who do.
 