If you’ve never argued with a pigheaded, obstinate fool who
refuses to acknowledge scientific consensus, you’ve never really lived. Or
maybe, like me, you know somebody who was convinced the apocalypse was going to
happen, and then when it didn’t happen, postponed it. No? Just me?
I thought of that because one of the articles I read about
this phenomenon cited the Seekers,
a cult I’ve written about before, as an extreme example of folks
clinging to their beliefs in the face of undeniable evidence to the contrary.
In case you forgot, they believed that the Earth was going to be destroyed by
flooding on December 21, 1954, but that an alien spaceship was going to rapture
them and save them from the end of days. When the apocalypse didn’t happen
and the aliens didn’t come, the cult members rationalized that they had saved
the world at the last minute through the strength of their belief, and I was
like “Wait a minute, I’ve heard this one before.”
So why, and how, do these people ignore and/or rationalize
hard evidence? Like everything else in life, it has to do with feelings. When
we first encounter new information or a new conspiracy theory, our emotional response
to learning that officials at all levels of government have been replaced with
lizards wearing human suits occurs so quickly that we don’t have time to think
about it rationally. We’ll decide whether or not Michelle Obama is a
reptilian based on how we feel about it, and then later, we’ll think of what
sounds like a rational argument to support it. If it’s an argument that no one
can really prove or disprove, like the existence of God, so much the better.
Now, imagine someone comes along and says, “You’re being
ridiculous, reptilians aren’t real, etc.”
“Well, even if they aren’t real, it can still be my opinion
that they’re real, even if they’re not,” you say.
“Your opinion is wrong.”
“Opinions can’t be wrong.”
*This is what happens inside my head every time someone says opinions can’t be wrong.*
That happens because, according
to Arthur Lupia at the University of Michigan, we react to information that
feels emotionally threatening as if it were a real threat, like a tiger or
something. Of course, it’s not a tiger, but we retreat from it anyway, even if
that means shutting down the conversation.
Of course, verifiable scientific facts are different, right?
Of course they aren’t, go crawl back under your rock. People, unsurprisingly,
decide whether or not a scientist is credible based on how much they agree with
what he or she has to say. FFS. Since scientists never agree with each other
(Two percent of scientists don’t believe in evolution. Who are these people?),
it’s easy enough for people on both sides of an issue to decide that their scientists
are right and the other side’s scientists are wrong, and wait a minute, I’ve
heard this one before.
So that’s why 88 percent of scientists believe that GMOs are
safe, but only 37 percent of the public does. Something about GMOs feels
threatening to people, and some of them are using the
opinions of the other 12 percent of scientists to back them up.
That’s not to say that people can never be persuaded to
change their minds. It makes sense that people who don’t have a
particularly strong emotional attachment to an issue are more amenable to
changing their minds about it. Some folks will also relinquish their most cherished beliefs to ally themselves with other members of their
social group. Dartmouth professor Brendan Nyhan recently
published research that suggests that thinking about a time when you felt
good about yourself can help you come to a more accurate understanding of a
loaded political issue. So, remember that the next time you get into an
argument.
*Put down the stick and think about the time you won the third-grade science fair.*