While I realize that the thread I made previously on a related matter is still active, it was made more than three weeks ago, and I feel this article deserves its own thread. It's a long article, but I find it very well written, insightful, and worth the read.
Why Science is so Hard to Believe
Excerpts from the article, though I think the first one captures it all, and is very interesting:
excerpt said:The “science communication problem,” as it’s blandly called by the scientists who study it, has yielded abundant new research into how people decide what to believe — and why they so often don’t accept the expert consensus. It’s not that they can’t grasp it, according to Dan Kahan of Yale University. In one study he asked 1,540 Americans, a representative sample, to rate the threat of climate change on a scale of zero to 10. Then he correlated that with the subjects’ science literacy. He found that higher literacy was associated with stronger views — at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce their worldviews.
additional excerpts said:Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts. There are so many of these controversies these days, you’d think a diabolical agency had put something in the water to make people argumentative.
...
In this bewildering world we have to decide what to believe and how to act on that. In principle, that’s what science is for. “Science is not a body of facts,” says geophysicist Marcia McNutt, who once headed the U.S. Geological Survey and is now editor of Science, the prestigious journal. “Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not.”
...
Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They nest in our brains, chirping at us as we try to make sense of the world. ... Yet we have trouble digesting randomness; our brains crave pattern and meaning.
...
Even for scientists, the scientific method is a hard discipline. They, too, are vulnerable to confirmation bias — the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them.
...
The media would also have you believe that science is full of shocking discoveries made by lone geniuses. Not so. The (boring) truth is that science usually advances incrementally, through the steady accretion of data and insights gathered by many people over many years.
...
We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community. When I mentioned to Kahan that I fully accept evolution, he said: “Believing in evolution is just a description about you. It’s not an account of how you reason.”
I think this article captures the essence of the issue here: science denial stems from the availability of information to everyone. I sort of touched on this in [MENTION=4945]EJCC[/MENTION]'s blog earlier today, and she did as well; it's the question of experts versus laymen, who causes the problem (or whether both do), and what can be done to mitigate it. The article doesn't really offer a solution. Nevertheless, it shows what's going on, and that it's actually quite complex. I wonder what others here feel would be a good way to manage this problem in this modern era, with its overflow of available information.
On a personal level (and to be perfectly honest, I am not proud to admit this, but I feel I must for the purposes of the thread), I have experienced the effect of using science to reinforce my worldviews. For several years, when I was around 16-20 (2005-2009), I was anti-fluoride, anti-vaccine, and partially anti-GMO. A lot of it came from my mother's influence, and I parroted it back. But I wanted those things to be true, so I found "evidence" (it wasn't really, of course) to support them. It wasn't until I started being more alert and critical, instead of starting from idealism, that I was forced to admit I had been looking at the wrong evidence and cherry-picking things that supported my views. It wasn't fun admitting that I was doing it wrong, and that I was wrong. I still find myself wrestling with this at times. I also still have an internal fear reaction whenever I get a vaccine, despite rationally knowing it's good. It's very important, though, to test what our guts tell us against the science when we're faced with credible evidence, because a lot of the time (as the article points out) science isn't intuitive, and even the deepest scientific education can't prevent one from slipping.
Discuss.