Becoming Self Aware: Critical Thinking

How can we improve the forum? Need help with something on the forum?
Post Reply
User avatar
Joel
Level 34 Illuminated
Posts: 7043

Becoming Self Aware: Critical Thinking

Post by Joel »

Just thought this might help

https://yourlogicalfallacyis.com/







Last edited by Joel on April 9th, 2017, 11:56 am, edited 1 time in total.

KMCopeland
captain of 1,000
Posts: 2279
Location: The American South

Re: Becoming Self Aware: Critical Thinking

Post by KMCopeland »

It absolutely could help.

braingrunt
captain of 1,000
Posts: 2042

Re: Becoming Self Aware: Critical Thinking

Post by braingrunt »

A quote from Cleon Skousen comes to mind (this will be a paraphrase from memory):
My brain trying to remember Skousen wrote: I was teaching at BYU, and a student came up to me and said, "I hope you won't ask me to memorize lots of facts."
"Well, what would you like me to do?"
"I want you to teach me how to think."
I said, "What with?"
It's funny but true. At some point we have to go beyond knowing "how" to think and actually let someone teach us something to think with. Perhaps we should let the scriptures teach us.


User avatar
Joel
Level 34 Illuminated
Posts: 7043

Re: Becoming Self Aware: Critical Thinking

Post by Joel »

The Fine Art of Baloney Detection

http://www.inf.fu-berlin.de/lehre/pmo/e ... aloney.pdf

User avatar
Joel
Level 34 Illuminated
Posts: 7043

The Illusion of Truth

Post by Joel »

If you repeat something enough times, it comes to feel good and true.

User avatar
Joel
Level 34 Illuminated
Posts: 7043

Living a Lie: We Deceive Ourselves to Better Deceive Others

Post by Joel »

Living a Lie: We Deceive Ourselves to Better Deceive Others

New research provides the first evidence for a theory first put forward in the 1970s

People mislead themselves all day long. We tell ourselves we’re smarter and better looking than our friends, that our political party can do no wrong, that we’re too busy to help a colleague. In 1976, in the foreword to Richard Dawkins’s The Selfish Gene, the biologist Robert Trivers floated a novel explanation for such self-serving biases: We dupe ourselves in order to deceive others, creating social advantage. Now after four decades Trivers and his colleagues have published the first research supporting his idea.

Psychologists have identified several ways of fooling ourselves: biased information-gathering, biased reasoning and biased recollections. The new work, forthcoming in the Journal of Economic Psychology, focuses on the first—the way we seek information that supports what we want to believe and avoid that which does not.

In one experiment Trivers and his team asked 306 online participants to write a persuasive speech about a fictional man named Mark. They were told they would receive a bonus depending on how effective it was. Some were told to present Mark as likable, others were instructed to depict him as unlikable, and the remaining subjects were directed to convey whatever impression they formed. To gather information about Mark, the participants watched a series of short videos, which they could stop watching at any point. For some viewers, most of the early videos presented Mark in a good light (recycling, returning a wallet), and they grew gradually darker (catcalling, punching a friend). For others, the videos went from dark to light.

When incentivized to present Mark as likable, people who watched the likable videos first stopped watching sooner than those who saw unlikable videos first. The former did not wait for a complete picture as long as they got the information they needed to convince themselves, and others, of Mark’s goodness. In turn, their own opinions about Mark were more positive, which led their essays about his good nature to be more convincing, as rated by other participants. (A complementary process occurred for those paid to present Mark as bad.) “What’s so interesting is that we seem to intuitively understand that if we can get ourselves to believe something first, we’ll be more effective at getting others to believe it,” says William von Hippel, a psychologist at The University of Queensland, who co-authored the study. “So we process information in a biased fashion, we convince ourselves, and we convince others. The beauty is, those are the steps Trivers outlined—and they all lined up in one study.”
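
The optional stopping at the heart of that experiment is simple enough to simulate. Below is a minimal Python sketch, with made-up numbers and no claim to match the study's actual design: a viewer who is motivated to see Mark as likable quits watching once the evidence leans his way, and so ends up with a rosier average impression than a viewer who watches everything.

[code]
import random

def watch(videos, motivated_sign=None, stop_threshold=3):
    """Return the viewer's final impression: the running sum of clips seen,
    where each clip is +1 (likable) or -1 (unlikable)."""
    total = 0
    for clip in videos:
        total += clip
        # A motivated viewer stops as soon as the evidence leans far
        # enough toward the conclusion they are paid to argue for.
        if motivated_sign is not None and total * motivated_sign >= stop_threshold:
            break
    return total

random.seed(0)
trials = 10_000
biased_sum = unbiased_sum = 0
for _ in range(trials):
    videos = [1] * 5 + [-1] * 5   # equal parts good and bad, shuffled
    random.shuffle(videos)
    biased_sum += watch(videos, motivated_sign=+1)
    unbiased_sum += watch(videos)

# The unbiased viewer always ends at 0 (the evidence is balanced);
# the motivated viewer's average impression comes out clearly positive.
print("mean impression, motivated viewer:", biased_sum / trials)
print("mean impression, unbiased viewer:", unbiased_sum / trials)
[/code]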

In real life you are not being paid to talk about Mark, but you may be selling a used car or debating a tax policy or arguing for a promotion—cases in which you benefit not from gaining and presenting an accurate picture of reality but from convincing someone of a particular point of view.

One of the most common types of self-deception is self-enhancement. Psychologists have traditionally argued that we evolved to overestimate our good qualities because it makes us feel good. But feeling good on its own has no bearing on survival or reproduction. Another assertion is that self-enhancement boosts motivation, leading to greater accomplishment. But if motivation were the goal, we would have simply evolved to be more motivated, without the costs of reality distortion.

Trivers argues that a glowing self-view makes others see us in the same light, leading to mating and cooperative opportunities. Supporting this argument, Cameron Anderson, a psychologist at the University of California, Berkeley, showed in 2012 that overconfident people are seen as more competent and have higher social status. “I believe there is a good possibility that self-deception evolved for the purpose of other-deception,” Anderson says.

In another study, forthcoming in Social Psychological and Personality Science, von Hippel and collaborators tested all three arguments together, in a longitudinal fashion. Does overconfidence in one’s self increase mental health? Motivation? Popularity?

Tracking almost 1,000 Australian high school boys for two years, the researchers found that over time, overconfidence about one’s athleticism and intelligence predicted neither better mental health nor better athletic or academic performance. Yet athletic overconfidence did predict greater popularity over time, supporting the idea that self-deception begets social advantage. (Intellectual self-enhancement may not have boosted popularity, the authors suggest, because among the teenage boys, smarts may have mattered less than sports.)

Why did it take so long for experimental evidence for Trivers’ idea to emerge? In part, he says, because he is a theorist and did not test it until he met von Hippel. Other experimental psychologists didn’t test it because the theory was not well known in psychology, von Hippel and Anderson say. Further, they suggest, most psychologists saw self-esteem or motivation as reason enough for self-enhancement to evolve.

Hugo Mercier, a researcher at the Institute for Cognitive Sciences in France who was not involved in the new studies, is familiar with the theory but questions it. He believes that in the long run overconfidence may backfire. He and others also debate whether motivated biases can strictly be called self-deception. “The whole concept is misleading,” he says. It’s not as though there is one part of us deliberately fooling another part of us that is the “self.” Trivers, von Hippel and Anderson of course disagree with Mercier on self-deception’s functionality and terminology.

Von Hippel offers two pieces of wisdom regarding self-deception: “My Machiavellian advice is this is a tool that works,” he says. “If you need to convince somebody of something, if your career or social success depends on persuasion, then the first person who needs to be [convinced] is yourself.” On the defensive side, he says, whenever anyone tries to convince you of something, think about what might be motivating that person. Even if he is not lying to you, he may be deceiving both you and himself.

User avatar
Joel
Level 34 Illuminated
Posts: 7043

Re: Becoming Self Aware: Critical Thinking

Post by Joel »

WHAT WOULD IT TAKE TO CHANGE YOUR MIND?

I’ve been writing about and teaching critical thinking for more than two decades. “Form beliefs on the basis of the evidence” was my mantra, and I taught tens of thousands of students how to do just that. Why, then, did people leave my classroom with the same preposterous beliefs as when they entered—from alternative medicine to alien abductions to Obama being a Muslim? Because I had been doing it wrong.

The problem is that everyone thinks they form their beliefs on the basis of evidence. That’s one of the issues, for example, with fake news. Whether it’s Facebook, Twitter, or just surfing Google, people read and share stories either that they want to believe or that comport with what they already believe—then they point to those stories as evidence for their beliefs. Beliefs are used as evidence for beliefs, with fake news just providing fodder.

Teaching people to formulate beliefs on the basis of evidence may, ironically, trap them in false views of reality. Doing so increases their confidence in the truth of a belief because they think they’re believing as good critical thinkers would, but they’re actually digging themselves into a cognitive sinkhole. The more intelligent one is, the deeper the hole. As Michael Shermer famously stated, “Smarter people are better at rationalizing bad ideas.” That is, smarter people are better at making inferences and using data to support their belief, independent of the truth of that belief.

What, then, can we skeptics do? Here’s my recommendation: Instead of telling people to form beliefs on the basis of evidence, encourage them to seek out something, anything, that could potentially undermine their confidence in a particular belief. (Not something that will, but something that could. Phrased this way it’s less threatening.) This makes thinking critical.

Here’s an example of how to accomplish that: Jessica believes Obama is a Muslim. Ask her, on a scale from 1–10, how confident she is in that belief. Once she’s articulated a number, say 9, ask her what evidence she could encounter that would undermine her confidence. For example, what would it take to lower her confidence from 9 to 8, or even 6? Ask her a few questions to help her clarify her thoughts, and then invite her to seek out that evidence.

Philosophers call this “defeasibility”: whether or not a belief is revisable. For example, as Muslims don’t drink alcohol, perhaps a picture of Obama drinking beer would lower her confidence from 9 to 8, or maybe videos over the last eight years of Obama praying at Saint John’s Church in DC would be more effective, lowering her confidence to a 6. Or maybe these wouldn’t budge her confidence. Maybe she’d have well-rehearsed, uncritical responses to these challenges.
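
The 1-to-10 elicitation above is easy to model in code. Here is an illustrative Python sketch (the Belief record and its fields are my own assumptions, not anything the author describes): a belief is just a claim, a self-reported confidence, and the list of defeaters its holder agrees would lower that confidence. A belief whose defeater list stays empty is, by the holder's own account, unrevisable.

[code]
from dataclasses import dataclass, field

@dataclass
class Belief:
    claim: str
    confidence: int                   # self-reported, on a 1-10 scale
    defeaters: list = field(default_factory=list)

    def elicit_defeater(self, evidence: str, lowered_confidence: int) -> None:
        """Record evidence the holder agrees would lower their confidence."""
        self.defeaters.append((evidence, lowered_confidence))

b = Belief("Obama is a Muslim", confidence=9)
b.elicit_defeater("a picture of Obama drinking beer", 8)
b.elicit_defeater("eight years of videos of Obama praying at Saint John's", 6)

print(f"{b.claim}: confidence {b.confidence}/10")
for evidence, lowered in b.defeaters:
    print(f"  would drop to {lowered}/10 given: {evidence}")
# If b.defeaters stayed empty, the belief would be unrevisable by the
# holder's own admission -- the red flag this exercise is meant to expose.
[/code]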

This is exactly what happened in my Science and Pseudoscience class at Portland State University. A student insisted Obama was a Muslim. When I displayed a series of pictures of Obama drinking beer on the projector, he instantly and emphatically responded, “Those pictures are photoshopped!” I asked him, on a scale of 1–10, how sure he was. He responded 9.9. I then asked him if he’d like to write an extra-credit paper detailing how the claim that the pictures were photoshopped could be false.

This strategy is effective because asking the question, “What evidence would it take to change your mind?” creates openings or spaces in someone’s belief where they challenge themselves to reflect upon whether or not their confidence in a belief is justified. You’re not telling them anything. You’re simply asking questions. And every time you ask it’s another opportunity for people to reevaluate and revise their beliefs. Every claim can be viewed as such, an opportunity to habituate people to seek disconfirming evidence.

If we don’t place defeasibility front and center, we’re jeopardizing people’s epistemic situation by unwittingly helping them artificially inflate the confidence they place in their beliefs. We’re creating less humility because they’re convincing themselves they’re responsible believers and thus that their beliefs are more likely to be true. That’s the pedagogical solution. It’s the easy part.

The more difficult part is publicly saying, “I don’t know” when we’re asked a question and don’t know the answer. And more difficult still, admitting “I was wrong” when we make a mistake. These are skills worth practicing.

Critical thinking begins with the assumption that our beliefs could be in error, and if they are, that we will revise them accordingly. This is what it means to be humble. Contributing to a culture where humility is the norm begins with us. We can’t expect people to become critical thinkers until we admit our own beliefs or reasoning processes are sometimes wrong, and that there are some questions, particularly in our specialties, that we don’t know how to answer. Doing so should help people become better critical thinkers, far more than 1000 repetitions of “form beliefs on the basis of evidence” ever could.

User avatar
Joel
Level 34 Illuminated
Posts: 7043

Why incompetent people think they're amazing - David Dunning

Post by Joel »


https://ed.ted.com/lessons/why-incompet ... #digdeeper


The Dunning-Kruger effect, at its core, suggests that people fail to recognize their intellectual and social shortcomings because they simply lack the expertise necessary to see them. As such, the effect reflects a double curse: people’s deficits cause them to make many mistakes, and then those exact same deficits prevent them from seeing their decisions as mistakes. As a consequence, the pervasive tendency for people to overrate themselves and their talents is not necessarily due to their ego, but rather to intellectual deficits that they cannot see. We all share this problem, in that we all have pockets of incompetence that remain invisible to us.
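
The double curse also falls out of a toy simulation. The sketch below assumes, purely for illustration, that error in judging one's own skill grows as skill shrinks; every number in it is invented. The qualitative pattern matches the effect: the bottom quartile overestimates itself the most, while the top quartile is nearly accurate.

[code]
import random

random.seed(1)
people = []
for _ in range(100_000):
    skill = random.uniform(0, 100)        # true ability, as a percentile
    # Illustrative assumption: the less skill, the worse the self-insight,
    # and the errors skew upward (mistakes go unrecognized).
    insight_error = (100 - skill) / 2
    estimate = min(skill + random.uniform(0, insight_error), 100.0)
    people.append((skill, estimate))

people.sort()                             # order by true skill
size = len(people) // 4
for i in range(4):
    quartile = people[i * size:(i + 1) * size]
    mean_skill = sum(s for s, _ in quartile) / size
    mean_est = sum(e for _, e in quartile) / size
    print(f"quartile {i + 1}: true skill {mean_skill:5.1f}, "
          f"self-estimate {mean_est:5.1f}")
[/code]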

Over the years, the Dunning-Kruger effect and its implications have received a fair amount of attention in the popular press, even including a Doonesbury cartoon.

A long-form essay about the effect by one of the psychologists who originally described it can be found at Pacific Standard, focusing particularly on the role of false beliefs in giving people undeserved confidence. Award-winning documentary filmmaker Errol Morris published a five-part essay musing about the effect and its impact on American history in his blog at the New York Times.

Elsewhere, several radio broadcasts and podcasts dig into the effect in detail. A short treatment of the effect can be found on a 2016 episode of This American Life.

Longer discussions about the effect can be found at You Are Not So Smart. A wide-ranging radio program linking it to other psychological phenomena can be found at Ideas.

Recently, the phenomenon has been linked to the rejection of experts (BBC’s The Human Zoo) and to the field of agnotology, which explores how financial and political interests spread ignorance and doubt in the public. The classic historical example is how tobacco companies obscured the fact that their product caused lung cancer. To tie the effect to the business world, one can visit the lively video podcast A Sales Guy.

The original scholarly article by Kruger & Dunning that introduced the effect can be found here. A longer article reviewing the science about flawed self-judgment in general, covering implications for health, education, and the workplace, can be located here. A longer scientific review article focusing solely on research about the effect is here.

A book covering the psychological science on faulty self-judgment in general is Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself.

Over the years, several authors have written blog essays about the effect and their particular area of interest (e.g., medicine, politics, education, scuba diving, aviation, creative writing, the list goes on). One never knows what one might hit by googling “Dunning Kruger” and one’s favorite activity.

User avatar
Joel
Level 34 Illuminated
Posts: 7043

Discovering Truth - By Challenging It!

Post by Joel »


User avatar
Joel
Level 34 Illuminated
Posts: 7043

Ignorance Will Always Grow Faster Than Knowledge

Post by Joel »


Patriot16
captain of 100
Posts: 209

Re: Becoming Self Aware: Critical Thinking

Post by Patriot16 »

Joel wrote: October 26th, 2014, 10:03 am Just thought this might help

With all the conspiracy theories and Russian-sourced material on this forum, I thought critical thinking was forbidden. I still do.

Patriot16
captain of 100
Posts: 209

Re: Becoming Self Aware: Critical Thinking

Post by Patriot16 »

Just the sight of the phrase "critical thinking" in this forum made me spew coffee out my nose.

Patriot16

User avatar
Joel
Level 34 Illuminated
Posts: 7043

Re: Becoming Self Aware: Critical Thinking

Post by Joel »

Patriot16 wrote: January 24th, 2018, 9:46 pm Just the sight of the phrase "critical thinking" in this forum made me spew coffee out my nose.

Patriot16
:lol:


User avatar
Thinker
Level 34 Illuminated
Posts: 12975
Location: The Universe - wherever that is.

Re: Becoming Self Aware: Critical Thinking

Post by Thinker »

Good stuff, thanks!

I made something like this and taught my kids too... (from the first clip in the OP)

[image]

Post Reply