
I Don't Want to be Right

Vasilisa

Symbolic Herald
Joined
Feb 2, 2010
Messages
3,946
Instinctual Variant
so/sx
I Don't Want to be Right
Why do people persist in believing things that just aren't true?
May 19, 2014
By Maria Konnikova
The New Yorker

Excerpt:
Last month, Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds.

The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet—focussed on a lack of evidence connecting vaccines and autism—seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines. “It’s depressing,” Nyhan said. “We were definitely depressed,” he repeated, after a pause.

Nyhan’s interest in false beliefs dates back to early 2000, when he was a senior at Swarthmore. It was the middle of a messy Presidential campaign, and he was studying the intricacies of political science. “The 2000 campaign was something of a fact-free zone,” he said. Along with two classmates, Nyhan decided to try to create a forum dedicated to debunking political lies. The result was Spinsanity, a fact-checking site that presaged venues like PolitiFact and the Annenberg Policy Center’s factcheck.org. For four years, the trio plugged along. Their work was popular—it was syndicated by Salon and the Philadelphia Inquirer, and it led to a best-selling book—but the errors persisted. And so Nyhan, who had already enrolled in a doctorate program in political science at Duke, left Spinsanity behind to focus on what he now sees as the more pressing issue: If factual correction is ineffective, how can you make people change their misperceptions? The 2014 vaccine study was part of a series of experiments designed to answer the question.

Until recently, attempts to correct false beliefs haven’t had much success. Stephan Lewandowsky, a psychologist at the University of Bristol whose research into misinformation began around the same time as Nyhan’s, conducted a review of misperception literature through 2012. He found much speculation, but, apart from his own work and the studies that Nyhan was conducting, there was little empirical research. In the past few years, Nyhan has tried to address this gap by using real-life scenarios and news in his studies: the controversy surrounding weapons of mass destruction in Iraq, the questioning of Obama’s birth certificate, and anti-G.M.O. activism. Traditional work in this area has focussed on fictional stories told in laboratory settings, but Nyhan believes that looking at real debates is the best way to learn how persistently incorrect views of the world can be corrected.

One thing he learned early on is that not all errors are created equal. Not all false information goes on to become a false belief—that is, a more lasting state of incorrect knowledge—and not all false beliefs are difficult to correct. Take astronomy. If someone asked you to explain the relationship between the Earth and the sun, you might say something wrong: perhaps that the sun rotates around the Earth, rising in the east and setting in the west. A friend who understands astronomy may correct you. It’s no big deal; you simply change your belief.

But imagine living in the time of Galileo, when understandings of the Earth-sun relationship were completely different, and when that view was tied closely to ideas of the nature of the world, the self, and religion. What would happen if Galileo tried to correct your belief? The process isn’t nearly as simple. The crucial difference between then and now, of course, is the importance of the misperception. When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur.

In those scenarios, attempts at correction can indeed be tricky. In a study from 2013, Kelly Garrett and Brian Weeks looked to see if political misinformation—specifically, details about who is and is not allowed to access your electronic health records—that was corrected immediately would be any less resilient than information that was allowed to go uncontested for a while. At first, it appeared as though the correction did cause some people to change their false beliefs. But, when the researchers took a closer look, they found that the only people who had changed their views were those who were ideologically predisposed to disbelieve the fact in question. If someone held a contrary attitude, the correction not only didn’t work—it made the subject more distrustful of the source. A climate-change study from 2012 found a similar effect. Strong partisanship affected how a story about climate change was processed, even if the story was apolitical in nature, such as an article about possible health ramifications from a disease like the West Nile Virus, a potential side effect of climate change. If information doesn’t square with someone’s prior beliefs, he discards the beliefs if they’re weak and discards the information if the beliefs are strong.

Even when we think we’ve properly corrected a false belief, the original exposure often continues to influence our memory and thoughts. In a series of studies, Lewandowsky and his colleagues at the University of Western Australia asked university students to read the report of a liquor-store robbery that had ostensibly taken place in Australia’s Northern Territory. Everyone read the same report, but in some cases racial information about the perpetrators was included and in others it wasn’t. In one scenario, the students were led to believe that the suspects were Caucasian, and in another that they were Aboriginal. At the end of the report, the racial information either was or wasn’t retracted. Participants were then asked to take part in an unrelated computer task for half an hour. After that, they were asked a number of factual questions (“What sort of car was found abandoned?”) and inference questions (“Who do you think the attackers were?”). After the students answered all of the questions, they were given a scale to assess their racial attitudes toward Aboriginals.

Everyone’s memory worked correctly: the students could all recall the details of the crime and could report precisely what information was or wasn’t retracted. But the students who scored highest on racial prejudice continued to rely on the racial misinformation that identified the perpetrators as Aboriginals, even though they knew it had been corrected. They answered the factual questions accurately, stating that the information about race was false, and yet they still relied on race in their inference responses, saying that the attackers were likely Aboriginal or that the store owner likely had trouble understanding them because they were Aboriginal. This was, in other words, a laboratory case of the very dynamic that Nyhan identified: strongly held beliefs continued to influence judgment, despite correction attempts—even with a supposedly conscious awareness of what was happening.

In a follow-up, Lewandowsky presented a scenario that was similar to the original experiment, except now, the Aboriginal was a hero who disarmed the would-be robber. This time, it was students who had scored lowest in racial prejudice who persisted in their reliance on false information, in spite of any attempt at correction. In their subsequent recollections, they mentioned race more frequently, and incorrectly, even though they knew that piece of information had been retracted. False beliefs, it turns out, have little to do with one’s stated political affiliations and far more to do with self-identity: What kind of person am I, and what kind of person do I want to be? All ideologies are similarly affected.

It’s the realization that persistently false beliefs stem from issues closely tied to our conception of self that prompted Nyhan and his colleagues to look at less traditional methods of rectifying misinformation. Rather than correcting or augmenting facts, they decided to target people’s beliefs about themselves. In a series of studies that they’ve just submitted for publication, the Dartmouth team approached false-belief correction from a self-affirmation angle, an approach that had previously been used for fighting prejudice and low self-esteem. The theory, pioneered by Claude Steele, suggests that, when people feel their sense of self threatened by the outside world, they are strongly motivated to correct the misperception, be it by reasoning away the inconsistency or by modifying their behavior. For example, when women are asked to state their gender before taking a math or science test, they end up performing worse than if no such statement appears, conforming their behavior to societal beliefs about female math-and-science ability. To address this so-called stereotype threat, Steele proposes an exercise in self-affirmation: either write down or say aloud positive moments from your past that reaffirm your sense of self and are related to the threat in question. Steele’s research suggests that affirmation makes people far more resilient and high performing, be it on an S.A.T., an I.Q. test, or at a book-club meeting.

Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would. On all issues, attitudes became more accurate with self-affirmation, and remained just as inaccurate without. That effect held even when no additional information was presented—that is, when people were simply asked the same questions twice, before and after the self-affirmation.

Still, as Nyhan is the first to admit, it’s hardly a solution that can be applied easily outside the lab. “People don’t just go around writing essays about a time they felt good about themselves,” he said. And who knows how long the effect lasts—it’s not as though we often think good thoughts and then go on to debate climate change.

But, despite its unwieldiness, the theory may still be useful. Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted. Instead, why not focus on presenting issues in a way that keeps broader notions out of it—messages that are not political, not ideological, not in any way a reflection of who you are?

< Full Story >

 

burymecloser

Member
Joined
Jan 31, 2010
Messages
516
MBTI Type
INTP
Enneagram
6w5
A few of the examples cited I'd heard about before, but it's surprising - and upsetting - how far this goes.

The overall conclusion is clear, but I'd be interested to know how many people did change their attitudes in the face of factual evidence to the contrary. Reading this, I assumed I would be open to new information, but I guess we can't know. Would be curious if there's any Fi/Ti disparity at play.
 

infinite

New member
Joined
Mar 19, 2014
Messages
565
MBTI Type
ISTP
Enneagram
~8
Instinctual Variant
sx/sp
A few of the examples cited I'd heard about before, but it's surprising - and upsetting - how far this goes.

The overall conclusion is clear, but I'd be interested to know how many people did change their attitudes in the face of factual evidence to the contrary. Reading this, I assumed I would be open to new information, but I guess we can't know. Would be curious if there's any Fi/Ti disparity at play.

Yeah pretty upsetting.

As for Fi/Ti blahblah, here's a datapoint from me. It has actually happened before that I was presented with factual evidence to the contrary. By factual evidence I don't mean someone else's personal views, but hard data, e.g. numbers from actual measurements of stuff. I could not choose to ignore any of it. It just wouldn't go away from my consciousness. It does have a profound effect on my opinion when I'm presented with such things.

When it's subjective opinion, interpretation, blahblahblah, it's not hard evidence, so it takes much more to change my opinion. Some stuff, e.g. pointing out an inconsistency (an actual one, not just wordplay), does still help in these cases.
 

Beorn

Permabanned
Joined
Dec 10, 2008
Messages
5,005
I have to say that I was immediately put off by the title of the article. It strikes me as a bit condescending and arrogant to presume that people are somehow willfully ignorant because they disagree on some issues. I don't doubt that there are people who are willfully ignorant and can't be reasoned with, but I don't believe the examples given necessitate that. Moreover, the fact that this guy was engaged in trying to establish a political fact checking group turns me off as well. There is no place where the truth is in between the lines more than in politics. There may be hard facts involved, but even simply stating facts betrays bias. What facts matter? How does this fact fit into how I assess a candidate, party, or policy? How does this fact compare to that fact? The reality is that people who want to be seen as dispensers of truth and facts are seeking a power that will enable them to manipulate and control what people think.

"But, despite its unwieldiness, the theory may still be useful. Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted."

I think this statement makes it most clear that the author desires to just deposit knowledge in people's brains. That's just not how knowledge works, and that's a good thing. I don't want people to be robots, and I don't want people to unthinkingly assent to whatever someone in a lab coat tells them.

I do believe in absolute truth and make a big deal out of that often here. However, the existence of absolute truth and access to that truth are two separate issues. The author seems to think that scientific facts handed down by the CDC are self-evident, which I think is hardly the case. I believe that the reality is that the best way to access truth is through indirect communication. You can see this in the way that Jesus relied on parables and the way that Plato and Socrates relied on dialectical conversation. Indirect communication actually involves the participant in an active way. It immediately engenders trust in the participant, as its success is reliant on the participant's own reasoning ability. Direct communication is in many ways inferior. It immediately creates a dichotomy between the authority and the recipient, who is expected to passively accept the knowledge. Indeed, every direct communication begins with an appeal to authority.

That is what the author misses when he's examining the results of the study on anti-vaxers. There was a common denominator between all the efforts to persuade. Maybe some rejected each method because they were just hard-headed, but my guess is that their minds turned off as soon as they realized the information was provided by the CDC. They rejected the authority of the CDC. Is that reasonable? Probably not. I think it's obvious that this stems from a broader distrust in government. So whereas these guys seem obsessed with figuring out a way to deposit knowledge in people's minds I'm more concerned with creating a culture where people are encouraged to think critically and fixing a very untrustworthy government.

Finally, I don't give a shit what the CDC says about raw milk. It's delicious and well worth whatever risk I'm taking on and the same goes for over easy eggs and rare steak.
 

burymecloser

Member
Joined
Jan 31, 2010
Messages
516
MBTI Type
INTP
Enneagram
6w5
I have to say that I was immediately put off by the title of the article. It strikes me as a bit condescending and arrogant to presume that people are somehow willfully ignorant because they disagree on some issues.
This isn't disagreement on matters of opinion; people were explicitly shown that 2+2=4, and they chose to believe that 2+2=5. They would rather be wrong than change their beliefs.

Beorn said:
That is what the author misses when he's examining the results of the study on anti-vaxers. There was a common denominator between all the efforts to persuade. Maybe some rejected each method because they were just hard-headed, but my guess is that their minds turned off as soon as they realized the information was provided by the CDC.
They surveyed 2,000 people. Let's be real; most of those people had never heard of the CDC. And those surveyed didn't just ignore the information presented to them, they doubled down on what they already believed:
It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines.

And the vaccine issue was not an isolated example; it's representative of a broader pattern. You may have missed the point of the article.
 

WhoCares

Guest
So the Law Of Attraction is real. Thoughts do create our reality, just not in the way New Ageists believe.
 

Tellenbach

in dreamland
Joined
Oct 27, 2013
Messages
6,088
MBTI Type
ISTJ
Enneagram
6w5
a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism

1. One can't discount the power of the anecdote. Anecdotes are considered valid evidence by many people including myself.
2. A scientific study finding no correlation between vaccines and autism at a 0.05 confidence interval may find significance at a 0.10 confidence interval.
3. Scientific studies frequently come up with contradictory results. One week, coffee is good for you and the next week it's not.
4. Government lies to us frequently and maintaining a healthy level of skepticism is a good thing.

I also don't trust lefty academics to define what an "incorrect view" or a "false belief" is.
 

ᴅeparted

passages
Joined
Jan 25, 2014
Messages
8,265
1. One can't discount the power of the anecdote. Anecdotes are considered valid evidence by many people including myself.
2. A scientific study finding no correlation between vaccines and autism at a 0.05 confidence interval may find significance at a 0.10 confidence interval.
3. Scientific studies frequently come up with contradictory results. One week, coffee is good for you and the next week it's not.
4. Government lies to us frequently and maintaining a healthy level of skepticism is a good thing.

I also don't trust lefty academics to define what an "incorrect view" or a "false belief" is.

So essentially, you trust pretty much nothing? I have the same issue with my mother and ask her the same question: what do you trust?
 

Tellenbach

in dreamland
Joined
Oct 27, 2013
Messages
6,088
MBTI Type
ISTJ
Enneagram
6w5
[MENTION=20829]Hard[/MENTION] I trust in my judgement, which is impeccable. I trust in results. If a doctor tells me he's treated 100 schizophrenics successfully using niacin, that would have an impact, especially if I learn that the patients have corroborated that story. I trust in Amazon product reviews and yelp restaurant reviews. Sure you get burned once in a while but for the most part, they haven't failed me. I trust in my ability to follow the logic from policy to outcome. I can usually tell what the outcome of a policy decision will be because I understand human nature.
 

ᴅeparted

passages
Joined
Jan 25, 2014
Messages
8,265
[MENTION=20829]Hard[/MENTION] I trust in my judgement, which is impeccable. I trust in results. If a doctor tells me he's treated 100 schizophrenics successfully using niacin, that would have an impact, especially if I learn that the patients have corroborated that story. I trust in Amazon product reviews and yelp restaurant reviews. Sure you get burned once in a while but for the most part, they haven't failed me. I trust in my ability to follow the logic from policy to outcome. I can usually tell what the outcome of a policy decision will be because I understand human nature.

That explains a lot. While I use some of that, the big difference is that I defer judgement to larger groups, and less so to individual opinions (unless that individual has clout).

I find it interesting when others trust themselves and their judgement so implicitly; I certainly don't. A lot of that comes from being 1w2. I have to cross-check it externally to see if it checks out. If I am overruled I let it go, even if I have to do it kicking and screaming. If there are potential consequences to my actions, I almost never rely on myself alone.

I definitely do the same thing as well with amazon and yelp. I am really picky with it too. I've had friends get angry at me before because I refused restaurants due to what I felt was too bad of a review. They thought it was worth overlooking. It can be paralyzing though. There's been a lot of products I haven't bought because one or two valid negative reviews scared me off.

I don't trust anecdotes most of the time though. Partly because of the experience with my mother, that's pretty much all she does, and she is not capable of using critical thinking. When you have to force someone to accept that sea salt does not contain less sodium, you know there's a problem. I really really want her to stop wasting money at her "doctor" who isn't even a real medical doctor, but there's no reasoning with her.
 

burymecloser

Member
Joined
Jan 31, 2010
Messages
516
MBTI Type
INTP
Enneagram
6w5
Ok, now I wish the article hadn't led with the vaccine bit ... it immediately loses a portion of the audience who are emotionally invested in that issue (and simultaneously reinforces the author's point) and it sidetracks the larger issue, even though further examples are provided.

I think you missed the point of my argument.
What a very clever response!
 

infinite

New member
Joined
Mar 19, 2014
Messages
565
MBTI Type
ISTP
Enneagram
~8
Instinctual Variant
sx/sp
Moreover, the fact that this guy was engaged in trying to establish a political fact checking group turns me off as well.

Are you trying to say that it's not a good idea to check for *all* facts when assessing political claims?


There is no place where the truth is in between the lines more than in politics. There may be hard facts involved, but even simply stating facts betrays bias. What facts matter? How does this fact fit into how I assess a candidate, party, or policy? How does this fact compare to that fact?

The answer is really simple. Check all facts, including ones that seem to go against your current opinion. Then see how all these facts work together. Yes sure it's a fucking complex process. Though some facts do speak for themselves.


The reality is that people who want to be seen as dispensers of truth and facts are seeking a power that will enable them to manipulate and control what people think.

Heh, your interpretation. I just see the article's writer as someone who'd like people to be willing to evaluate all facts before mindlessly picking one opinion or mindlessly sticking with it.


"But, despite its unwieldiness, the theory may still be useful. Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted."

I think this statement makes it most clear that the author desires to just deposit knowledge in people's brains. That's just not how knowledge works, and that's a good thing. I don't want people to be robots, and I don't want people to unthinkingly assent to whatever someone in a lab coat tells them.

This is still interesting, how you managed to read all that stuff into these lines. Nowhere was it stated that people should just accept everything "someone in a lab coat tells them".

If you continue to read the article, you can see it explicitly talks about wanting to find strategies to help people think more broad-mindedly and also help people assess facts without mixing it all with self-affirmation issues. I would say this is a rather good and useful goal. Do you not agree with that goal then?


I do believe in absolute truth and make a big deal out of that often here. However, the existence of absolute truth and access to that truth are two separate issues. The author seems to think that scientific facts handed down by the CDC are self-evident, which I think is hardly the case. I believe that the reality is that the best way to access truth is through indirect communication. You can see this in the way that Jesus relied on parables and the way that Plato and Socrates relied on dialectical conversation. Indirect communication actually involves the participant in an active way. It immediately engenders trust in the participant, as its success is reliant on the participant's own reasoning ability. Direct communication is in many ways inferior. It immediately creates a dichotomy between the authority and the recipient, who is expected to passively accept the knowledge. Indeed, every direct communication begins with an appeal to authority.

Lol you're so weird. Just because someone talks to you in a direct way, it doesn't mean you're supposed to passively accept what they say. Nah. If you have a problem with authority, then just ignore this perceived (and, I would say, imagined) "appeal to authority".


That is what the author misses when he's examining the results of the study on anti-vaxers. There was a common denominator between all the efforts to persuade. Maybe some rejected each method because they were just hard-headed, but my guess is that their minds turned off as soon as they realized the information was provided by the CDC. They rejected the authority of the CDC. Is that reasonable? Probably not. I think it's obvious that this stems from a broader distrust in government. So whereas these guys seem obsessed with figuring out a way to deposit knowledge in people's minds I'm more concerned with creating a culture where people are encouraged to think critically and fixing a very untrustworthy government.

Lol, critical thinking involves avoiding knee-jerk reactions based on what the source of a piece of information is (e.g. CDC).

I think the article here is actually about encouraging critical thinking.


Finally, I don't give a shit what the CDC says about raw milk. It's delicious and well worth whatever risk I'm taking on and the same goes for over easy eggs and rare steak.

That's all cool if you assessed risks and your priorities and chose based on that.


1. One can't discount the power of the anecdote. Anecdotes are considered valid evidence by many people including myself.
2. A scientific study finding no correlation between vaccines and autism at a 0.05 confidence interval may find significance at a 0.10 confidence interval.
3. Scientific studies frequently come up with contradictory results. One week, coffee is good for you and the next week it's not.
4. Government lies to us frequently and maintaining a healthy level of skepticism is a good thing.

I also don't trust lefty academics to define what an "incorrect view" or a "false belief" is.

1. Anecdotes are only circumstantial evidence. Sometimes they do tell you something, sometimes not.
2. If you find significance at 0.10, you can repeat the experiment and see what happens then. And, of course, when a study reports a significant or non-significant correlation at 0.05, you'll still want to see whether other studies have been done and whether they had the same result... and ideally, the studies that were just shoved in a drawer because someone didn't like the results. Quite honestly, even at 0.05 it's just a probability; you'll still need to explain the stuff properly. 0.05 isn't all that great IMO (see the sketch after this list). Maybe at 0.0000001, or something like that, it's okay to just take it as fact without further explanation, though even then it would be great to have a real explanation.
3. That's because in some kinds of sciences, such as nutrition with the coffee example, it's hard to control for all variables in one single study. Actually, impossible. Again, it's much better to have a real explanation beyond one study finding a correlation of a certain strength. It's just hard to do in these kinds of sciences. Will take a while to get there.
4. Sure, critical thinking is good. Regardless of whether the govt has the intention of lying or is just simply wrong about something.
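To put rough numbers on that 0.05 vs. 0.10 point, here's a quick Python sketch (my own toy simulation, nothing from the article or the studies it cites): it runs many simulated studies in which the effect is truly zero and counts how often each threshold still flags a "significant" result.

```python
import math
import random

random.seed(42)

def one_null_study(n=500):
    """One simulated study where the effect is truly zero: both groups
    are drawn from the same normal distribution. Returns a two-sided
    p-value from a z-test on the difference in means."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = sum(a) / n - sum(b) / n
    se = math.sqrt(2 / n)  # standard error of the difference (sigma = 1 known)
    z = abs(diff) / se
    # two-sided p-value: 2 * (1 - Phi(|z|)), with Phi via the error function
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

p_values = [one_null_study() for _ in range(10_000)]
for alpha in (0.10, 0.05, 0.0000001):
    hits = sum(p < alpha for p in p_values)
    print(f"alpha = {alpha}: {hits} of 10000 no-effect studies look 'significant'")
```

Under no true effect the p-values are uniform, so expect roughly 1,000 flukes at 0.10 and 500 at 0.05 (and essentially none at 1e-7), which is why a lone 0.10-level finding calls for replication before anyone updates a belief on it.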


That explains a lot. While I use some of that, the big difference is that I defer judgement to larger groups, and less so to individual opinions (unless that individual has clout).

God, is that some Fe thing?! Why does it matter, when evaluating an opinion, if an individual has some kind of influence? That's really just silly. I prefer to use my logic and facts; regardless of source, logic is logic and fact is fact. (...ok, that wording is not the best. I'm referring to hard facts; e.g. with scientific studies, if they are repeatable, that's good: then it's definitely not a false claim by one researcher faking the results.)


I find it interesting when others trust themselves and their judgement so implicitly; I certainly don't. A lot of that comes from being 1w2. I have to cross-check it externally to see if it checks out. If I am overruled I let it go, even if I have to do it kicking and screaming. If there are potential consequences to my actions, I almost never rely on myself alone.

You're weird too o_O

Or, I find it interesting you don't trust your own judgment. Yeah. Heh.

(I'm not trying to be offensive with the laugh. It's just funny how you said "interesting")


I definitely do the same thing as well with amazon and yelp. I am really picky with it too. I've had friends get angry at me before because I refused restaurants due to what I felt was too bad of a review. They thought it was worth overlooking. It can be paralyzing though. There's been a lot of products I haven't bought because one or two valid negative reviews scared me off.

Tbh quite a few reviews come from people who either have different preferences from yours (thus their experiences will not apply to you), or they simply don't know what they're talking about. But yes, it can be some nice data, reading/hearing about experiences of others.


I don't trust anecdotes most of the time though.

Hey! Agreement! And your mother must be a pain in the ass, I feel for you.. :D
 

ᴅeparted

passages
Joined
Jan 25, 2014
Messages
8,265
God, is that some Fe thing?! Why does it matter, when evaluating an opinion, if an individual has some kind of influence? That's really just silly. I prefer to use my logic and facts; regardless of source, logic is logic and fact is fact. (...ok, that wording is not the best. I'm referring to hard facts; e.g. with scientific studies, if they are repeatable, that's good: then it's definitely not a false claim by one researcher faking the results.)

You're weird too o_O

Or, I find it interesting you don't trust your own judgment. Yeah. Heh.

(I'm not trying to be offensive with the laugh)

Tbh a lot of reviews come from people who either have different preferences from yours, or they simply don't know what they're talking about. But yes, it can be some nice data reading/hearing about experiences of others.

Hey! Agreement! And your mother must be a pain in the ass, I feel for you.. :D

I guess it is Fe. The way I look at it is: if a person has a reputation, they are an expert in their field/area, and they have proven to be competent and right over time, then they deserve to be listened to and taken seriously. Sure, if they err then it needs to be corrected, but I see it as a way to condense information. I trust people easily (it's implicit for me actually; I don't need to think about it) so that comes with it. Logic indeed is logic, but I take into consideration who is giving it. It can have an impact on the information they deliver. But yeah, when it comes to hard factual data, that "check" need starts to go down a lot. But like, if the head of Focus on the Family released a study on families, I'd be like "yeah... they released that? HA!" He's got a history, and it impacts it.

Well, I trust my own judgement to a certain extent, perhaps more than I present myself. Still, I find external things less likely to err, so checking external to myself reduces error. I mean, I check what the person is like, or what the review is. If it becomes apparent that their judgement was poor, or that they had different issues/concerns than I had, then I can write it off.
 

Sniffles

Guest
Is there a link to the actual study? I'd be interested in reading this.
 

wildflower

New member
Joined
Jul 8, 2011
Messages
317
admittedly i only read the first two paragraphs but it sounds like they were trying to change people's entrenched beliefs through junk mail. too funny.
 

infinite

New member
Joined
Mar 19, 2014
Messages
565
MBTI Type
ISTP
Enneagram
~8
Instinctual Variant
sx/sp
I guess it is Fe. The way I look at it is: if a person has a reputation, they are an expert in their field/area, and they have proven to be competent and right over time, then they deserve to be listened to and taken seriously.

Yeah, it's alright with the listening part. It's probably a good idea to hear what a person with expertise has to say. I still process that data the same way I would other data, though.


Sure, if they err then it needs to be corrected, but I see it as a way to condense information. I trust people easily (it's implicit for me actually; I don't need to think about it) so that comes with it. Logic indeed is logic, but I take into consideration who is giving it. It can have an impact on the information they deliver. But yeah, when it comes to hard factual data, that "check" need starts to go down a lot. But like, if the head of Focus on the Family released a study on families, I'd be like "yeah... they released that? HA!" He's got a history, and it impacts it.

Of course, good idea to take into account possible biases.


Well I trust my own judgement to a certain extent, perhaps more than I present myself.

That would make sense :)
 

Society

Guest
[MENTION=5789]Beorn[/MENTION], your point might be applicable to a few of the examples given (that is, they compared it to what the study conductor considered to be factual), but if you'll notice, in the robbery scenario they did test whether both sides of the aisle would keep to their previous misconceptions (and they did), and in that case we're talking about corrections made by the same source that delivered the story in the first place. politics provided a groundwork in which it was easy to find people who have a personal stake in similar topics (in contrast to personal matters, which would be harder to test for commonalities), but the point isn't political, it's the degree to which people will revert back to their personal stance in the face of contradicting information.

[MENTION=21718]infinite[/MENTION] - somewhat off topic, but since Ti/Fi almost never deviates in surveys from the 50/50% mark, for it to have made a significant difference it would have had to alter the results of the study dramatically, from an overwhelming majority of similar reactions to a dichotomy of different reactions. as it stands, this doesn't seem to be the case, but it's hard to tell without being able to look at the actual studies.
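
to put that in concrete terms, here's a toy simulation (mine, with purely hypothetical effect sizes, nothing from the actual studies): if one half of a 50/50 split had updated toward the evidence while the other half backfired, the average shift would wash out to roughly zero instead of the across-the-board backfire the studies report.

```python
import random

random.seed(0)

# hypothetical effect sizes, purely for illustration:
# +1.0 = moved toward the corrective evidence, -1.0 = backfired
SHIFT = {"Ti": +1.0, "Fi": -1.0}

# a 50/50 Ti/Fi sample, mirroring the survey split mentioned above
people = [random.choice(["Ti", "Fi"]) for _ in range(2000)]
shifts = [SHIFT[p] for p in people]

avg = sum(shifts) / len(shifts)
print(f"average attitude shift: {avg:+.3f}")  # close to 0: the halves cancel
print(f"share who backfired: {shifts.count(-1.0) / len(shifts):.1%}")  # ~50%
```

an overall backfire (a clearly negative average) is hard to square with half the sample updating correctly, so the mostly uniform reactions in the studies argue against a Ti/Fi split driving them.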

what i am most interested in is the claim that self-affirmation drills counter this effect, which is said to be supported but not shown how. that's a shame, as i find it the most interesting part of the article. if i am getting it right, the idea is "don't panic! your identity is safe! now look at the new information..", but wouldn't that just reaffirm the very information they hold to in contrast to the new information? would people be more resilient to the cognitive dissonance or will it dissolve over time? also, what differentiates those who modify the behavior in response to criticism from those who don't? the potential applications here are huge, this shouldn't be glossed over.
 

infinite

New member
Joined
Mar 19, 2014
Messages
565
MBTI Type
ISTP
Enneagram
~8
Instinctual Variant
sx/sp
[MENTION=21718]infinite[/MENTION] - somewhat off topic, but since Ti/Fi almost never deviates in surveys from the 50/50% mark, for it to have made a significant difference it would have had to alter the results of the study dramatically, from an overwhelming majority of similar reactions to a dichotomy of different reactions. as it stands, this doesn't seem to be the case, but it's hard to tell without being able to look at the actual studies.

How do you think Ti/Fi would have been different from Te/Fe? I'm not sure I understand you there.


what i am most interested in is the claim that self-affirmation drills counter this effect, which is said to be supported but not shown how. that's a shame, as i find it the most interesting part of the article. if i am getting it right, the idea is "don't panic! your identity is safe! now look at the new information..", but wouldn't that just reaffirm the very information they hold to in contrast to the new information?

As to the bolded: Apparently not. :) It makes sense to me.


would people be more resilient to the cognitive dissonance or will it dissolve over time? also, what differentiates those who modify the behavior in response to criticism from those who don't? the potential applications here are huge, this shouldn't be glossed over.

Where do we bring "criticism" into the picture? I thought this was just about presenting facts (and interpreting them).


I'll go out on a limb and propose that maybe you'd like to read this:

The best way to win an argument « Mind Hacks

Nice article. So am I alone in knowing very well that I don't know a lot about certain topics? I'm usually painfully aware of where my current understanding ends. The article gave examples about flushing toilets, car speedometers and sewing machines... yeah I have no idea how any of those really work and I would have willingly admitted that in the study. Honestly I've never really checked out these sorts of things, though I sometimes get curious for a second or two about how certain everyday things work "under the hood". But then I just forget to go look it up or something... Though if the study had said "computers", I would have responded differently :D
 