Our assumption is always that if people are furnished with the facts, they will be clear thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight. Well, maybe not, according to a study done at the University of Michigan.
Facts don't necessarily have the power to change our minds. Researchers found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Facts could actually make misinformation even stronger among the misinformed as they "dig their heels in" to justify their factually wrong positions.
Research found that many people already hold beliefs: a set of facts lodged in their minds. When notions they hold are proven false, instead of changing their minds to reflect the correct information, they entrench themselves even deeper. "The general idea is that for some people, it can be absolutely threatening to admit you're wrong. The phenomenon, known as backfire, is a natural defense mechanism to avoid cognitive dissonance," the lead researcher on the Michigan study said.
We often base our opinions on our beliefs, which can have an uneasy relationship with facts. Rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. Worst of all, our preconceived notions can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we're right, and even less likely to listen to any new information. The effect has been heightened by the information glut: the endless rumors and misinformation on the Internet and cable TV. In other words, it has never been easier for people to be wrong, and at the same time feel more certain than ever that they're right.
How can we get things so wrong, and be so sure that we're right? Because people tend to interpret information with an eye toward reinforcing their pre-existing views. The research suggests that once those facts, or what are thought of as "facts," are internalized, they are very difficult to budge. Notably, politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right 90% of the time, but their confidence makes it nearly impossible to correct the 10% on which they're totally wrong.
It is unclear what drives the behavior; it may range from simple defensiveness to how we are wired. But research found one interesting, rather clear driver: self-esteem. In other words, if you feel good about yourself, you'll listen; you'll be open to the ideas of others and to new information. If you feel insecure or threatened, you are less likely to be open to new information and dissenting opinions. This is why voters who are fearful, insecure, and feel threatened are so easily controlled and manipulated by politicians.
An interesting book related to this topic is "On Being Certain: Believing You Are Right Even When You're Not" by Robert A. Burton, M.D. Dr. Burton, a neurologist, argues that feeling certain, that is, feeling that we know something, is actually a mental sensation rather than evidence of fact. His work suggests that feelings of certainty stem from primitive areas of the brain and are independent of reasoning. In other words, the feeling of knowing happens to us; we cannot make it happen. That is a discussion for another day, but the book is recommended reading.
The University of Michigan study is especially relevant in a time of information overload. The lesson may be to feel good about yourself, be willing to listen to new information, be willing to challenge your preconceived notions, and, as Dr. Stephen Covey would say, "seek first to understand, then to be understood."
(Source for this piece: "How Facts Backfire" by Joe Keohane, Boston Globe)