It's funny but true. At some point we have to go beyond knowing "how" to think and actually let someone teach us. Perhaps we should let the scriptures teach us.

My brain trying to remember Skousen wrote: I was teaching at BYU and a student came up to me and said, "I hope you won't ask me to memorize lots of facts."
"Well, what would you like me to do?"
"I want you to teach me how to think."
I said "What with?"
Living a Lie: We Deceive Ourselves to Better Deceive Others
New research provides the first evidence for a theory first put forward in the 1970s
People mislead themselves all day long. We tell ourselves we’re smarter and better looking than our friends, that our political party can do no wrong, that we’re too busy to help a colleague. In 1976, in the foreword to Richard Dawkins’s The Selfish Gene, the biologist Robert Trivers floated a novel explanation for such self-serving biases: We dupe ourselves in order to deceive others, creating social advantage. Now, after four decades, Trivers and his colleagues have published the first research supporting his idea.
Psychologists have identified several ways of fooling ourselves: biased information-gathering, biased reasoning and biased recollections. The new work, forthcoming in the Journal of Economic Psychology, focuses on the first—the way we seek information that supports what we want to believe and avoid that which does not.
In one experiment Trivers and his team asked 306 online participants to write a persuasive speech about a fictional man named Mark. They were told they would receive a bonus depending on how effective it was. Some were told to present Mark as likable, others were instructed to depict him as unlikable, and the remaining subjects were directed to convey whatever impression they formed. To gather information about Mark, the participants watched a series of short videos, which they could stop watching at any point. For some viewers, most of the early videos presented Mark in a good light (recycling, returning a wallet), and they grew gradually darker (catcalling, punching a friend). For others, the videos went from dark to light.
When incentivized to present Mark as likable, people who watched the likable videos first stopped watching sooner than those who saw unlikable videos first. The former did not wait for a complete picture, stopping as soon as they had the information they needed to convince themselves, and others, of Mark’s goodness. In turn, their own opinions about Mark were more positive, which led their essays about his good nature to be more convincing, as rated by other participants. (A complementary process occurred for those paid to present Mark as bad.) “What’s so interesting is that we seem to intuitively understand that if we can get ourselves to believe something first, we’ll be more effective at getting others to believe it,” says William von Hippel, a psychologist at The University of Queensland, who co-authored the study. “So we process information in a biased fashion, we convince ourselves, and we convince others. The beauty is, those are the steps Trivers outlined—and they all lined up in one study.”
In real life you are not being paid to talk about Mark, but you may be selling a used car, debating a tax policy or arguing for a promotion—cases in which you benefit not from gaining and presenting an accurate picture of reality but from convincing someone of a particular point of view.
One of the most common types of self-deception is self-enhancement. Psychologists have traditionally argued that we evolved to overestimate our good qualities because it makes us feel good. But feeling good on its own has no bearing on survival or reproduction. Another assertion is that self-enhancement boosts motivation, leading to greater accomplishment. But if motivation were the goal, then we would have just evolved to be more motivated, without the costs of reality distortion.
Trivers argues that a glowing self-view makes others see us in the same light, leading to mating and cooperative opportunities. Supporting this argument, Cameron Anderson, a psychologist at the University of California, Berkeley, showed in 2012 that overconfident people are seen as more competent and have higher social status. “I believe there is a good possibility that self-deception evolved for the purpose of other-deception,” Anderson says.
In another study, forthcoming in Social Psychological and Personality Science, von Hippel and collaborators tested all three arguments together, in a longitudinal fashion. Does overconfidence in one’s self increase mental health? Motivation? Popularity?
Tracking almost 1,000 Australian high school boys for two years, the researchers found that over time, overconfidence about one’s athleticism and intelligence predicted neither better mental health nor better athletic or academic performance. Yet athletic overconfidence did predict greater popularity over time, supporting the idea that self-deception begets social advantage. (Intellectual self-enhancement may not have boosted popularity, the authors suggest, because among the teenage boys, smarts may have mattered less than sports.)
Why did it take so long for experimental evidence for Trivers’s idea to emerge? In part, he says, because he is a theorist and did not test it until he met von Hippel. Other experimental psychologists didn’t test it because the theory was not well known in psychology, von Hippel and Anderson say. Further, they suggest, most psychologists saw self-esteem or motivation as reason enough for self-enhancement to evolve.
Hugo Mercier, a researcher at the Institute for Cognitive Sciences in France who was not involved in the new studies, is familiar with the theory but questions it. He believes that in the long run overconfidence may backfire. He and others also debate whether motivated biases can strictly be called self-deception. “The whole concept is misleading,” he says. It’s not as though there is one part of us deliberately fooling another part of us that is the “self.” Trivers, von Hippel and Anderson of course disagree with Mercier on self-deception’s functionality and terminology.
Von Hippel offers two pieces of wisdom regarding self-deception: “My Machiavellian advice is this is a tool that works,” he says. “If you need to convince somebody of something, if your career or social success depends on persuasion, then the first person who needs to be [convinced] is yourself.” On the defensive side, he says, whenever anyone tries to convince you of something, think about what might be motivating that person. Even if he is not lying to you, he may be deceiving both you and himself.
WHAT WOULD IT TAKE TO CHANGE YOUR MIND?
I’ve been writing about and teaching critical thinking for more than two decades. “Form beliefs on the basis of the evidence,” was my mantra, and I taught tens of thousands of students how to do just that. Why, then, did people leave my classroom with the same preposterous beliefs as when they entered—from alternative medicine to alien abductions to Obama being a Muslim? Because I had been doing it wrong.
The problem is that everyone thinks they form their beliefs on the basis of evidence. That’s one of the issues, for example, with fake news. Whether it’s Facebook, Twitter, or just surfing Google, people read and share stories either that they want to believe or that comport with what they already believe—then they point to those stories as evidence for their beliefs. Beliefs are used as evidence for beliefs, with fake news just providing fodder.
Teaching people to formulate beliefs on the basis of evidence may, ironically, trap them in false views of reality. Doing so increases their confidence in the truth of a belief because they think they’re believing as good critical thinkers would, but they’re actually digging themselves into a cognitive sinkhole. The more intelligent one is, the deeper the hole. As Michael Shermer famously stated, “Smarter people are better at rationalizing bad ideas.” That is, smarter people are better at making inferences and using data to support their belief, independent of the truth of that belief.
What, then, can we skeptics do? Here’s my recommendation: Instead of telling people to form beliefs on the basis of evidence, encourage them to seek out something, anything, that could potentially undermine their confidence in a particular belief. (Not something that will, but something that could. Phrased this way it’s less threatening.) This makes thinking critical.
Here’s an example of how to accomplish that: Jessica believes Obama is a Muslim. Ask her, on a scale from 1–10, how confident she is in that belief. Once she’s articulated a number, say 9, ask her what evidence she could encounter that would undermine her confidence. For example, what would it take to lower her confidence from 9 to 8, or even 6? Ask her a few questions to help her clarify her thoughts, and then invite her to seek out that evidence.
Philosophers call this process “defeasibility,” which refers to whether a belief is open to revision. For example, as Muslims don’t drink alcohol, perhaps a picture of Obama drinking beer would lower her confidence from 9 to 8, or maybe videos over the last eight years of Obama praying at Saint John’s Church in DC would be more effective, lowering her confidence to a 6. Or maybe these wouldn’t budge her confidence. Maybe she’d have well-rehearsed, uncritical responses to these challenges.
This is exactly what happened in my Science and Pseudoscience class at Portland State University. A student insisted Obama was a Muslim. When I displayed a series of pictures of Obama drinking beer on the projector, he instantly and emphatically responded, “Those pictures are photoshopped!” I asked him, on a scale of 1–10, how sure he was. He responded 9.9. I then asked him if he’d like to write an extra-credit paper detailing how the claim that the pictures were photoshopped could be false.
This strategy is effective because asking the question, “What evidence would it take to change your mind?” creates openings or spaces in someone’s belief where they challenge themselves to reflect upon whether or not their confidence in a belief is justified. You’re not telling them anything. You’re simply asking questions. And every time you ask it’s another opportunity for people to reevaluate and revise their beliefs. Every claim can be viewed as such, an opportunity to habituate people to seek disconfirming evidence.
If we don’t place defeasibility front and center, we’re jeopardizing people’s epistemic situation by unwittingly helping them artificially inflate the confidence they place in their beliefs. We’re fostering less humility because they’re convincing themselves they’re responsible believers and thus that their beliefs are more likely to be true. That’s the pedagogical solution. It’s the easy part.
The more difficult part is publicly saying, “I don’t know” when we’re asked a question and don’t know the answer. And more difficult still, admitting “I was wrong” when we make a mistake. These are skills worth practicing.
Critical thinking begins with the assumption that our beliefs could be in error, and if they are, that we will revise them accordingly. This is what it means to be humble. Contributing to a culture where humility is the norm begins with us. We can’t expect people to become critical thinkers until we admit our own beliefs or reasoning processes are sometimes wrong, and that there are some questions, particularly in our specialties, that we don’t know how to answer. Doing so should help people become better critical thinkers, far more than 1000 repetitions of “form beliefs on the basis of evidence” ever could.