Why we hate to change our minds
Monday, February 13, 2017
The Greater our Investment
The greater the likelihood
we will hold on to ideas that don’t serve us
© Madelyn Griffith-Haynie, CTP, CMC, ACT, MCC, SCAC
Foundational Concept of the Intentionality Series
Opinions vs. Facts
Sometimes people hold a core belief that is very strong. Presented with conflicting information, accepting the new evidence would create a feeling that is extremely uncomfortable (called cognitive dissonance).
And because it is so important to protect that core belief, they will rationalize, ignore, and even deny anything that doesn’t fit with the core belief. ~ Frantz Fanon, Free Your Mind and Think
There has been a great deal of research and writing on the implications of the concept of confirmation bias. I have often referred to the concept here on ADDandSoMuchMORE.com, so many of my regular readers are already familiar with the expression.
Given today’s political climate, I believe it is time to review a few ideas
as we all attempt to make sense of what’s going on.
Some of you will recall seeing the information in the box below – but I believe it will be useful to take a moment to reread it as an introduction to this particular article.
Confirmation bias is a term describing the unconscious tendency of people to favor information that confirms their hypotheses or closely held belief systems.
Individuals display confirmation bias when they selectively gather, note or remember information, or when they interpret it in a way that fits what they already believe.
The effect is stronger for emotionally charged issues, for deeply entrenched beliefs, when we are desperate for answers, and when there is more attachment to being right than being effective.
How it tends to work
Human beings will interpret the same information in radically different ways to support their own views of themselves. We hate to believe that we might have been wrong — especially when we have invested time and energy coming to a decision.
Studies on fraternity hazing have shown repeatedly that, when attempting to join a group, the more difficult the barriers to group acceptance, the more people will value their membership.
To resolve the discrepancy between the hoops they were forced to jump through and the reality of their actual experience, they are likely to convince themselves that their decision was, in fact, the best possible choice they could have made.
Similar logic helps to explain the “Stockholm Syndrome,” the actions of those who seem to remain loyal to their captors following their release.
People quickly adjust their opinions to fit their behavior — sometimes even when it goes against their moral beliefs overall. We ALL do it at times, even those of us who are aware of the dynamic and consciously fight against it.
It’s an unconscious adaptation that is a result of the brain’s desire for self-consistency. For example:
- Those who take home pens or paper from their workplace might tell themselves that “Everybody does it” — and that they would be losing out if they didn’t do it too.
- Or they will tell themselves, perhaps, “I’m so underpaid I deserve a little extra under the table – they expect us to do it.”
And nowhere is it easier to see than in political disagreements!
When validating our view on a contentious point, we conveniently overlook or override information that is at odds with our current or former opinions, while recalling everything that fits with what is more psychologically comfortable to believe, whether or not we are consciously aware of it.
We don’t have to look further than the aftermath of the most recent election here in America for many excellent examples of how difficult it is for human beings to believe that they might have been wrong.
To understand why, we need to look briefly at another concept that science has many studies to support: cognitive dissonance.
Don’t forget that you can always check out the sidebar
for a reminder of how links work on this site; they’re subtle ==>
The unconscious workings of cognitive dissonance
Social psychologists studying cognitive dissonance have long been interested in the way we deal with two thoughts that contradict each other – and how we deal with the discrepancy.
As early as 1959, a ground-breaking experiment by Leon Festinger and James Carlsmith provided some fascinating insight into the explanations we come up with to justify our thoughts and actions, and why we adapt our thinking the way we do.
Their conclusions are still recognized as valid today.
Those who are interested can find a plain-language description of the study HERE.
So what IS Cognitive Dissonance?
It has been said that cognitive dissonance is a mismatch between what one believes and what the evidence supports. That’s part of it, but not all — and certainly not a definition likely to get anybody to change his or her mind. Let’s open the paradigm a bit.
Cognitive dissonance is the term that science uses to describe that highly uncomfortable feeling of internal tension that comes from holding two conflicting thoughts at the same time — for example, I have always been a pacifist but I just enlisted in the Army.
Almost immediately, the brain searches for ways to reduce the dissonance, especially following a decision, either by redefining our terms or by finding a way to justify actions that don’t seem to support our beliefs.
I’m sure you can come up with at least a few comments you’ve heard to expand on my example above.
Two interesting conclusions continue to emerge as science explores the concept.
- When a person has been required or convinced to say or do something in conflict with a previously held opinion, they will tend to modify that opinion to bring it into correspondence with what they have said or done.
- The greater the pressure used to convince people to change their opinion, the less likely they are to rethink that opinion and the more likely they are to justify holding it.
That finding alone underscores the need for caution when we attempt to “logic” somebody out of an opinion or belief.
If they suspect that we think continuing to believe what they do is stupid, for example, it will unconsciously activate their “I am NOT a stupid person” programming, and we’ve lost the debate, possibly forever.
As I continue to say on this brain-based blog: Make-Wrong NEVER works!
Two studies reported by Irving Janis and Bert King (1954; 1956), published in The Journal of Abnormal and Social Psychology, showed that when individuals were required to improvise a speech supporting a point of view with which they disagreed, their private opinions moved closer to the position they advocated in the speech.
The change in opinion was greater than in those who only heard the speech, or in those who simply read a prepared speech and were told to focus on execution and delivery alone, indicating that those who improvise convince themselves, even if they are unable to convince others.
THAT certainly explains a lot of the “alternative facts” we’ve heard in the news
and around the ‘net since November, doesn’t it?
The importance of keeping this concept in mind
Neuroscientist-turned-novelist Dr. Robert Burton (On Being Certain: Believing You Are Right Even When You’re Not) had this to say about it all during an interview on the Brain Science Podcast:
“My goal is to strip away the power of certainty by exposing its involuntary neurological roots. If science can shame us into questioning the nature of conviction, we might develop some degree of tolerance and an increased willingness to consider alternative ideas.”
World Peace, perhaps?
© 2017, all rights reserved
Check bottom of Home/New to find out the “sharing rules”
(reblogs always okay, and much appreciated)
Shared on the Senior Salon
As always, if you want notification of new articles in this Series – or any new posts on this blog – give your email address to the nice form on the top of the skinny column to the right. (You only have to do this once, so if you’ve already asked for notification about a prior series, you’re covered for this one too). STRICT No Spam Policy
IN ANY CASE, do stay tuned.
There’s a lot to know, a lot here already, and a lot more to come – in this Series and in others.
Get it here while it’s still free for the taking.
Want to work directly with me? If you’d like some coaching help with anything that came up while you were reading this Series (one-on-one, couples or group), click HERE for Brain-based Coaching with mgh, with a contact form at its end (or click the E-me link on the menubar at the top of every page). Fill out the form, submit, and an email SOS is on its way to me; we’ll schedule a call to talk about what you need. I’ll get back to you ASAP (accent on the “P”ossible!)
You might also be interested in some of the following articles
available right now – on this site and elsewhere.
For links in context: run your cursor over the article above and the dark grey links will turn dark red;
(subtle, so they don’t pull focus while you read, but you can find them to click when you’re ready for them)
— and check out the links to other Related Content in each of the articles themselves —
Related articles right here on ADDandSoMuchMore.com
(in case you missed them above or below)
- Brain-based Coaching with Madelyn Griffith-Haynie
- Group Coaching Information LinkList
- Private Coaching Formats & Fees
- Confirmation Bias & The Tragedy of Certainty
- Change, Growth and Decision Dilemmas
- Yes AND vs. Yes but
- The Brain: Why much of what you think you know is WRONG
Other supports for this article
A Few LinkLists by Category (to articles here on ADDandSoMuchMore.com)
- The Walking A Mile in Another’s Shoes Series (you are NOT alone here!)
- The Articles of the What Kind of World do YOU Want? Series
- ABOUT The Brain-Transplant Series
Related Articles ’round the net
- Fighting Cognitive Dissonance & The Lies We Tell Ourselves
- Brain Science Podcast/Books & Ideas — Mistakes Were Made (But Not by Me):
Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts
- The Backfire Effect
- OpEd & Book Review: Strangers In Their Own Land: Anger and Mourning on the American Right
- Psychology’s Treacherous Trio: Confirmation Bias, Cognitive Dissonance, and Motivated Reasoning
- How motivated skepticism strengthens incorrect beliefs
- Mind Hacks: Cognitive dissonance reduction
BY THE WAY: Since ADDandSoMuchMore.com is an Evergreen site, I revisit all my content periodically to update links — when you link back, like, follow or comment, you STAY on the page. When you do not, you run a high risk of getting replaced by a site with a more generous come-from.