Cognitive dissonance
Posted: Wed Jan 13, 2021 11:51 pm
We've discussed cognitive dissonance before. It can be described as holding two conflicting beliefs at once, or continuing to believe a thing when the evidence shows something else.
In the classic Festinger and Carlsmith experiment, subjects who had just completed a very boring task were given either $0, $1, or $20 to lie to another person, telling them how much fun the task was. They were then later asked how much fun the task really was. Most people assume those who were paid more would have reported liking it better, but the $20 group reported that they didn't really like it very much, with overall ratings statistically equal to the control group--those who were not paid to lie.
The idea is that those who were paid only a dollar to lie experienced cognitive dissonance--two competing beliefs in their head. One, the task was boring and I said it was fun. Two, why would I lie for a dollar? Ergo, the task must not have been so bad after all. Those who were paid $20 didn't experience such dissonance because hey, it was boring, but I got $20 to say it wasn't.
This experiment has been replicated in other ways, showing that people often tend to double down, in a sense, when later information doesn't agree with an original belief. This idea has implications in religion as well. Example: I'm a rational and intelligent person who has a testimony of the first vision. New evidence shows there were multiple versions of the first vision, casting doubt on the veracity of what I've believed. Dissonance may occur. People tend to do one of three things: (1) seriously consider the new information and adapt/adjust the belief; (2) rationalize or justify to make the new data fit old beliefs, or vice versa; (3) double down and flatly reject the new info as false (i.e., I'm smart, what I believed is true, so anything that challenges the belief has to be false).
Festinger also studied people in a doomsday cult after their prophesied end of the world did not happen. These people obviously experienced cognitive dissonance. Those who were heavily invested in the cult--those who had left family and friends to join--became even more deeply invested after the failed prophecy, believing that their prayers had stopped the end of the world. Those who were less invested chalked it up to a dumb decision and went back to their old lives. We see this type of thing play out in politics as well--especially recently!
People also tend to seek out information that they know will not create this uncomfortable dissonance, which then becomes confirmation bias.
I'm explaining in a rather cursory way, I know, but I want to get to this observation and question.
The effect of cognitive dissonance is often discontent, belligerence, and/or avoidance. I've seen a lot of avoidance in my spouse regarding church. She doesn't want to quit. She doesn't want to talk about it. She wants to attend, but back when we used to go for three hours, she would leave after one. I love to discuss religion and church stuff, and she typically does not. She doesn't want to hear about the things that embarrass the church, yet she also doesn't want to do all of the church things and observe all of the commandments as TBMs typically do. She is clearly no longer TBM, but just as clearly conflicted in her belief. I think this might be explained by cognitive dissonance.
Has anyone else experienced similar stuff?