
  1. #1
    Vasilisa (Symbolic Herald)
    Join Date: Feb 2010
    Posts: 4,128

    I Don't Want to be Right

    I Don't Want to be Right
    Why do people persist in believing things that just aren't true?
    May 19, 2014
    By Maria Konnikova
    The New Yorker

    Excerpt:
    Last month, Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds.

    The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet—focussed on a lack of evidence connecting vaccines and autism—seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines. “It’s depressing,” Nyhan said. “We were definitely depressed,” he repeated, after a pause.

    Nyhan’s interest in false beliefs dates back to early 2000, when he was a senior at Swarthmore. It was the middle of a messy Presidential campaign, and he was studying the intricacies of political science. “The 2000 campaign was something of a fact-free zone,” he said. Along with two classmates, Nyhan decided to try to create a forum dedicated to debunking political lies. The result was Spinsanity, a fact-checking site that presaged venues like PolitiFact and the Annenberg Public Policy Center’s FactCheck.org. For four years, the trio plugged along. Their work was popular—it was syndicated by Salon and the Philadelphia Inquirer, and it led to a best-selling book—but the errors persisted. And so Nyhan, who had already enrolled in a doctoral program in political science at Duke, left Spinsanity behind to focus on what he now sees as the more pressing issue: If factual correction is ineffective, how can you make people change their misperceptions? The 2014 vaccine study was part of a series of experiments designed to answer the question.

    Until recently, attempts to correct false beliefs haven’t had much success. Stephan Lewandowsky, a psychologist at the University of Bristol whose research into misinformation began around the same time as Nyhan’s, conducted a review of misperception literature through 2012. He found much speculation, but, apart from his own work and the studies that Nyhan was conducting, there was little empirical research. In the past few years, Nyhan has tried to address this gap by using real-life scenarios and news in his studies: the controversy surrounding weapons of mass destruction in Iraq, the questioning of Obama’s birth certificate, and anti-G.M.O. activism. Traditional work in this area has focussed on fictional stories told in laboratory settings, but Nyhan believes that looking at real debates is the best way to learn how persistently incorrect views of the world can be corrected.

    One thing he learned early on is that not all errors are created equal. Not all false information goes on to become a false belief—that is, a more lasting state of incorrect knowledge—and not all false beliefs are difficult to correct. Take astronomy. If someone asked you to explain the relationship between the Earth and the sun, you might say something wrong: perhaps that the sun rotates around the Earth, rising in the east and setting in the west. A friend who understands astronomy may correct you. It’s no big deal; you simply change your belief.

    But imagine living in the time of Galileo, when understandings of the Earth-sun relationship were completely different, and when that view was tied closely to ideas of the nature of the world, the self, and religion. What would happen if Galileo tried to correct your belief? The process isn’t nearly as simple. The crucial difference between then and now, of course, is the importance of the misperception. When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur.

    In those scenarios, attempts at correction can indeed be tricky. In a study from 2013, Kelly Garrett and Brian Weeks looked to see if political misinformation—specifically, details about who is and is not allowed to access your electronic health records—that was corrected immediately would be any less resilient than information that was allowed to go uncontested for a while. At first, it appeared as though the correction did cause some people to change their false beliefs. But, when the researchers took a closer look, they found that the only people who had changed their views were those who were ideologically predisposed to disbelieve the fact in question. If someone held a contrary attitude, the correction not only didn’t work—it made the subject more distrustful of the source. A climate-change study from 2012 found a similar effect. Strong partisanship affected how a story about climate change was processed, even if the story was apolitical in nature, such as an article about possible health ramifications from a disease like the West Nile Virus, a potential side effect of climate change. If information doesn’t square with someone’s prior beliefs, he discards the beliefs if they’re weak and discards the information if the beliefs are strong.

    Even when we think we’ve properly corrected a false belief, the original exposure often continues to influence our memory and thoughts. In a series of studies, Lewandowsky and his colleagues at the University of Western Australia asked university students to read the report of a liquor robbery that had ostensibly taken place in Australia’s Northern Territory. Everyone read the same report, but in some cases racial information about the perpetrators was included and in others it wasn’t. In one scenario, the students were led to believe that the suspects were Caucasian, and in another that they were Aboriginal. At the end of the report, the racial information either was or wasn’t retracted. Participants were then asked to take part in an unrelated computer task for half an hour. After that, they were asked a number of factual questions (“What sort of car was found abandoned?”) and inference questions (“Who do you think the attackers were?”). After the students answered all of the questions, they were given a scale to assess their racial attitudes toward Aboriginals.

    Everyone’s memory worked correctly: the students could all recall the details of the crime and could report precisely what information was or wasn’t retracted. But the students who scored highest on racial prejudice continued to rely on the racial misinformation that identified the perpetrators as Aboriginals, even though they knew it had been corrected. They answered the factual questions accurately, stating that the information about race was false, and yet they still relied on race in their inference responses, saying that the attackers were likely Aboriginal or that the store owner likely had trouble understanding them because they were Aboriginal. This was, in other words, a laboratory case of the very dynamic that Nyhan identified: strongly held beliefs continued to influence judgment, despite correction attempts—even with a supposedly conscious awareness of what was happening.

    In a follow-up, Lewandowsky presented a scenario that was similar to the original experiment, except now, the Aboriginal was a hero who disarmed the would-be robber. This time, it was students who had scored lowest in racial prejudice who persisted in their reliance on false information, in spite of any attempt at correction. In their subsequent recollections, they mentioned race more frequently, and incorrectly, even though they knew that piece of information had been retracted. False beliefs, it turns out, have little to do with one’s stated political affiliations and far more to do with self-identity: What kind of person am I, and what kind of person do I want to be? All ideologies are similarly affected.

    It’s the realization that persistently false beliefs stem from issues closely tied to our conception of self that prompted Nyhan and his colleagues to look at less traditional methods of rectifying misinformation. Rather than correcting or augmenting facts, they decided to target people’s beliefs about themselves. In a series of studies that they’ve just submitted for publication, the Dartmouth team approached false-belief correction from a self-affirmation angle, an approach that had previously been used for fighting prejudice and low self-esteem. The theory, pioneered by Claude Steele, suggests that, when people feel their sense of self threatened by the outside world, they are strongly motivated to correct the misperception, be it by reasoning away the inconsistency or by modifying their behavior. For example, when women are asked to state their gender before taking a math or science test, they end up performing worse than if no such statement appears, conforming their behavior to societal beliefs about female math-and-science ability. To address this so-called stereotype threat, Steele proposes an exercise in self-affirmation: either write down or say aloud positive moments from your past that reaffirm your sense of self and are related to the threat in question. Steele’s research suggests that affirmation makes people far more resilient and high performing, be it on an S.A.T., an I.Q. test, or at a book-club meeting.

    Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would. On all issues, attitudes became more accurate with self-affirmation, and remained just as inaccurate without. That effect held even when no additional information was presented—that is, when people were simply asked the same questions twice, before and after the self-affirmation.

    Still, as Nyhan is the first to admit, it’s hardly a solution that can be applied easily outside the lab. “People don’t just go around writing essays about a time they felt good about themselves,” he said. And who knows how long the effect lasts—it’s not as though we often think good thoughts and then go on to debate climate change.

    But, despite its unwieldiness, the theory may still be useful. Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted. Instead, why not focus on presenting issues in a way that keeps broader notions out of it—messages that are not political, not ideological, not in any way a reflection of who you are?

    < Full Story >

    the formless thing which gives things form!
    Found Forum Haiku Project


    Positive Spin | your feedback welcomed | Darker Criticism

  2. #2
    burymecloser (Senior Member)
    Join Date: Jan 2010
    MBTI: INTP
    Enneagram: 6w5
    Posts: 514

    A few of the examples cited I'd heard about before, but it's surprising - and upsetting - how far this goes.

    The overall conclusion is clear, but I'd be interested to know how many people did change their attitudes in the face of factual evidence to the contrary. Reading this, I assumed I would be open to new information, but I guess we can't know. Would be curious if there's any Fi/Ti disparity at play.

  3. #3
    Senior Member
    Join Date: Mar 2014
    MBTI: ISTP
    Enneagram: ~8 sx/sp
    Socionics: SLE
    Posts: 565

    Quote Originally Posted by burymecloser
    A few of the examples cited I'd heard about before, but it's surprising - and upsetting - how far this goes.

    The overall conclusion is clear, but I'd be interested to know how many people did change their attitudes in the face of factual evidence to the contrary. Reading this, I assumed I would be open to new information, but I guess we can't know. Would be curious if there's any Fi/Ti disparity at play.
    Yeah, pretty upsetting.

    As for the Fi/Ti blahblah, here's a datapoint from me. It has actually happened before that I was presented with factual evidence to the contrary. By factual evidence I don't mean someone else's personal views, but hard data, e.g. numbers from actual measurements. I could not choose to ignore any of it; it just wouldn't go away from my consciousness. Being presented with such things has a profound effect on my opinion.

    When it's subjective opinion, interpretation, blahblahblah, it's not hard evidence, so it takes much more to change my opinion. Some stuff, e.g. pointing out an inconsistency - an actual one, not just wordplay - does still help in these cases.

  4. #4
    Beorn (LL P. Stewie)
    Join Date: Dec 2008
    Posts: 4,806

    I have to say that I was immediately put off by the title of the article. It strikes me as a bit condescending and arrogant to presume that people are somehow willfully ignorant because they disagree on some issues. I don't doubt that there are people who are willfully ignorant and can't be reasoned with, but I don't believe the examples given necessitate that. Moreover, the fact that this guy was engaged in trying to establish a political fact-checking group turns me off as well. There is no place where the truth lies between the lines more than in politics. There may be hard facts involved, but even simply stating facts betrays bias. What facts matter? How does this fact fit into how I assess a candidate, party, or policy? How does this fact compare to that fact? The reality is that people who want to be seen as dispensers of truth and facts are seeking a power that will enable them to manipulate and control what people think.

    "But, despite its unwieldiness, the theory may still be useful. Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted."

    I think this statement makes it most clear that the author desires simply to deposit knowledge in people's brains. That's just not how knowledge works, and that's a good thing. I don't want people to be robots, and I don't want people to unthinkingly assent to whatever someone in a lab coat tells them.

    I do believe in absolute truth and make a big deal out of that often here. However, the existence of absolute truth and access to that truth are two separate issues. The author seems to think that scientific facts handed down by the CDC are self-evident, which I think is hardly the case. I believe that the reality is that the best way to access truth is through indirect communication. You can see this in the way that Jesus relied on parables and the way that Plato and Socrates relied on dialectical conversation. Indirect communication actually involves the participant in an active way. It immediately engenders trust in the participant, as its success is reliant on the participant's own reasoning ability. Direct communication is in many ways inferior. It immediately creates a dichotomy between the authority and the recipient, who is expected to passively accept the knowledge. Indeed, every direct communication begins with an appeal to authority.

    That is what the author misses when he's examining the results of the study on anti-vaxers. There was a common denominator between all the efforts to persuade. Maybe some rejected each method because they were just hard-headed, but my guess is that their minds turned off as soon as they realized the information was provided by the CDC. They rejected the authority of the CDC. Is that reasonable? Probably not. I think it's obvious that this stems from a broader distrust of government. So whereas these guys seem obsessed with figuring out a way to deposit knowledge in people's minds, I'm more concerned with creating a culture where people are encouraged to think critically, and with fixing a very untrustworthy government.

    Finally, I don't give a shit what the CDC says about raw milk. It's delicious and well worth whatever risk I'm taking on and the same goes for over easy eggs and rare steak.

  5. #5
    burymecloser (Senior Member)
    Join Date: Jan 2010
    MBTI: INTP
    Enneagram: 6w5
    Posts: 514

    Quote Originally Posted by Beorn
    I have to say that I was immediately put off by the title of the article. It strikes me as a bit condescending and arrogant to presume that people are somehow willfully ignorant because they disagree on some issues.
    This isn't disagreement on matters of opinion; people were explicitly shown that 2+2=4, and they chose to believe that 2+2=5. They would rather be wrong than change their beliefs.

    Quote Originally Posted by Beorn
    That is what the author misses when he's examining the results of the study on anti-vaxers. There was a common denominator between all the efforts to persuade. Maybe some rejected each method because they were just hard-headed, but my guess is that their minds turned off as soon as they realized the information was provided by the CDC.
    They surveyed 2,000 people. Let's be real: most of those people had never heard of the CDC. And those surveyed didn't just ignore the information presented to them; they doubled down on what they already believed:
    It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines.
    And the vaccine issue was not an isolated example; it's representative of a broader pattern. You may have missed the point of the article.
    i just want to be a sweetheart

  6. #6
    Beorn (LL P. Stewie)
    Join Date: Dec 2008
    Posts: 4,806

    Quote Originally Posted by burymecloser
    You may have missed the point of the article.
    I think you missed the point of my argument.

  7. #7
    WhoCares (Guest)

    So the Law of Attraction is real. Thoughts do create our reality, just not in the way New Agers believe.

  8. #8
    Tellenbach (deplorable basketcase)
    Join Date: Oct 2013
    MBTI: ISTJ
    Enneagram: 6w5
    Posts: 3,953

    a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism
    1. One can't discount the power of the anecdote. Anecdotes are considered valid evidence by many people, including myself.
    2. A scientific study finding no significant link between vaccines and autism at the 0.05 significance level may still find significance at the looser 0.10 level (see the sketch after this list).
    3. Scientific studies frequently come up with contradictory results. One week, coffee is good for you and the next week it's not.
    4. Government lies to us frequently, and maintaining a healthy level of skepticism is a good thing.
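
    To make point 2 concrete, here is a minimal sketch in Python; the p-value of 0.07 is a made-up, illustrative number, not taken from any actual vaccine study. The same result fails the stricter 0.05 threshold but passes the looser 0.10 one.

    # Minimal sketch: one hypothetical p-value judged against two
    # significance thresholds (alpha levels).
    p_value = 0.07  # illustrative value, not from any real study

    for alpha in (0.05, 0.10):
        verdict = "significant" if p_value < alpha else "not significant"
        print(f"alpha = {alpha:.2f}: p = {p_value:.2f} -> {verdict}")

    Run as written, this prints "not significant" for alpha = 0.05 and "significant" for alpha = 0.10, which is the whole point: "no link found" can depend on where the threshold is set.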

    I also don't trust lefty academics to define what an "incorrect view" or a "false belief" is.
    Senator Rand Paul is alive because of modern medicine and because his attacker punches like a girl.

  9. #9
    Hard (I could do things)
    Join Date: Jan 2014
    MBTI: ENFJ
    Enneagram: 1w2 sp/so
    Socionics: EIE Fe
    Posts: 7,987

    Quote Originally Posted by Tellenbach
    1. One can't discount the power of the anecdote. Anecdotes are considered valid evidence by many people, including myself.
    2. A scientific study finding no significant link between vaccines and autism at the 0.05 significance level may still find significance at the looser 0.10 level.
    3. Scientific studies frequently come up with contradictory results. One week, coffee is good for you and the next week it's not.
    4. Government lies to us frequently, and maintaining a healthy level of skepticism is a good thing.

    I also don't trust lefty academics to define what an "incorrect view" or a "false belief" is.
    So essentially, you trust pretty much nothing? I have the same issue with my mother and ask her the same question: what do you trust?
    MBTI: ExxJ tetramer
    Functions: Fe > Te > Ni > Se > Si > Ti > Fi > Ne
    Enneagram: 1w2 - 3w4 - 6w5 (The Taskmaster) | sp/so
    Socionics: β-E dimer | -
    Big 5: slOaI
    Temperament: Choleric/Melancholic
    Alignment: Lawful Neutral
    External Perception: Nohari and Johari


  10. #10
    Tellenbach (deplorable basketcase)
    Join Date: Oct 2013
    MBTI: ISTJ
    Enneagram: 6w5
    Posts: 3,953

    @Hard I trust in my judgement, which is impeccable. I trust in results. If a doctor tells me he's treated 100 schizophrenics successfully using niacin, that would have an impact, especially if I learn that the patients have corroborated that story. I trust in Amazon product reviews and Yelp restaurant reviews. Sure, you get burned once in a while, but for the most part, they haven't failed me. I trust in my ability to follow the logic from policy to outcome. I can usually tell what the outcome of a policy decision will be because I understand human nature.
    Senator Rand Paul is alive because of modern medicine and because his attacker punches like a girl.
