
Back (finally) to reply to everyone's posts. Sorry about the delay... I was working on 4 hours of sleep and 12 hours of work when I logged on yesterday.
Also, although I get very emotional about the subject matter, I'm really glad that it's coming up and that I can read everyone's views on it (even if we might disagree).
Teaching simplified science, although it may not do justice to the entire process, is about the best we can do at a level below university. There is too broad a spectrum to learn; even as it is, kids have to take at least a year each of chemistry, biology, and physics, with more coursework afterwards; naturally, one could not understand the precise details of a field in that time. It is simplified so that they can get a general viewpoint that they can expand later if they want to follow up with it; if not, then it is enough education to get them through the world with enough understanding to function.
Should they teach skepticism? Perhaps, but I'm not sure how useful it would be to the general public. Most people do not need to know the exact reason why the solution is green; the principle is usually as much as they need. A non-scientist doesn't need to understand the major context behind an event; they need to know how to use it. That does not require as much skepticism.
I understand that. My issue with teaching simplified science is that it is taught as "fact", as a narrative. That's the main problem, in my mind. I agree that as a practical matter we have to teach simple concepts and work our way up (similar to learning), but to teach the simple concepts as if they are "actual reality" is what is happening in schools. And the same goes for policy-makers who have only a rudimentary understanding of what is going on.
As far as the general public not "needing" skepticism to understand science, I beg to differ. This is the general public who believe that "mixing human DNA with animal/bacterial DNA" is "wrong". This is the general public that believes that working with ES cells violates God's will. This is the general public that believes that a university is discharging radioactive waste into lakes because they glow green (actually, the university was discharging a fluorescent organic molecule). This is also the general public that panics when the CERN particle accelerator is switched on.
I may oppose ES cell research, but at least it's because I've worked in a mouse ES cell lab and have seen the things that are done to the animals. The problem with working with "pure" concepts that are "approximations" of reality, without any real understanding of context is that you think you know what's going on - but you don't really know. As opposed to admitting that you know nothing about it, and starting with a blank slate and allowing people who have the experience to make the decisions.
I'll agree that scientists do not have to understand all of the other fields they are collaborating with. However, they do have to have a basic understanding of what the other field does and how that ties into their own area of knowledge. And the scientific method, although it does not always progress science, does allow for a pattern of organization between these fields and across continents. Nowadays, when scientists are trying to share their discoveries with others who speak different languages and have different customs, they need to have some sort of layout to follow, especially one that allows for recreation of experiments. If something cannot be recreated and, therefore, proven or disproven, then its credibility becomes very debatable. The method gives structure to the madness, and it is definitely arguable that in contemporary science it is necessary.
Yes. Unfortunately, the basic understanding does not extend to limitations in technique and controls. I've already said that "the scientific method" does not apply to reality... The assumption that scientists "use the scientific method" promotes faith in the findings, yes. But it doesn't cohere with reality.
I'd say that when scientists share their findings with other scientists, the basis of comparison isn't so much "the scientific method", it's shared context. If the context isn't the same, it doesn't matter how many controls you do - the understanding on the other side of the telephone will be completely different.
That is not to say that it is necessary to science for the sake of science. If I am not mistaken, the scientific method wasn't even developed until Galileo's time, and there have been many progressive scientific discoveries before him.
But the scientific method, now, helps people to understand the way science is proven/disproven in modern times, and it is a good start in understanding what science itself encompasses.
And I am aware that others were off topic; you just seemed rather forcefully so. But you're making some very good points now, and I respect your opinion.
Actually, the definition of "the scientific method" only appeared with Robert Boyle, not Galileo. And historians of science have also shown, using prominent examples like the Copernican Revolution, that science does not "progress" via "the scientific method". That's why I'm opposed to limiting people's understanding of "science" to its supposed method. The actuality is different, and I can only think that the reason why people think that falsificationism plays such a large role in science is basically to build faith so that scientists can obtain funding. Perhaps that's the cynic in me speaking, but it seems to me like the labs that have the most funding aren't necessarily those doing the most crucial/achievable science. It's those who are best able to "sell" an idea.
It's probably my idealism, but I hope that one day, people will be able to understand the connections between different science groups, corporations, politics and funding. From there, they will be able to make a well-reasoned personal judgment as to how their taxpayer money could be better spent.
Yeah; the issue is figuring out how. It's very easy to point out areas that have problems; the hard part is changing the makeup of society to fix it :/
Skepticism is good when directed at the right targets; skepticism towards everything can lead to a lot of unhealthy consequences. I agree there should be more skepticism in science, but if skepticism is to be taught, it needs to be taught in a way that people can understand where and how to be properly skeptical...
Yes, the problem is "how". My ideas involve a revamp of education, starting with primary school. Instead of teaching the scientific concepts based on "fact", as a nice narrative, I'd advocate teaching what the concepts are, their historical and scientific context and how they relate to other disciplines.
I'll give an example here, of Darwin's theory of evolution. Currently in a "normal" science class, it would be taught as: 7 steps leading to speciation, genetic drift, natural selection, formation of reproductive isolation mechanisms etc etc. My HS science teacher specified at the start of the class that she was not going to talk about its sociological context, or religious implications, because she was there to help us PASS AN EXAM, not teach us "truth".
In my science class, I'd talk about what was taught to me, in addition to religious context, how religions are attempting to integrate their history and teachings with evolution in the context of intelligent design, previous ideas about how speciation arose, the WRONG examples in the textbooks (yes, there are, the peppered moth being a prominent example), the right examples in the textbooks, differences between micro/macroevolution and their individual contexts, reactions to the theory back in Darwin's time and now, and several other things. It probably wouldn't be very practical in terms of time spent in the classroom, but I think it's a whole lot more useful with regards to understanding the world.
I also have issues with this attitude that "skepticism towards everything is bad" and there being a method of being "properly skeptical". I am a fan of being a skeptic, and deciding (even with skepticism) what you're going to believe. I don't think it's possible to live a life that's completely logical and well-reasoned, and that assuming that everyone is logical etc has reduced the potential for progress.
First off, I apologize if I offended you; I was a little surprised you quoted only the last part of my post. For some reason, I got the impression that you were just a year or two out of college. It is not meant as an insult, but I knew people in technical fields who said similar things as RCGs but changed after working for many years. That was not meant to change anybody's mind, and it is certainly possible people believe that long retrainings are a necessity even after seeing many others make career transitions without needing them.
Oh no, you didn't offend me. It's just that I decided that you and gloomy-optimist were essentially making the same point there and I wanted to address similar points together. Then I, um, slept for 16 hours last night

and didn't get around to addressing the rest of your post. Sorry.
You're completely right, I'm only two years out of college this Dec. I may change my mind later on, I reserve the right to do so.

But right now I'm arguing from my perspective, which is based on observations of the people around me (and myself).
In response to the ideas that "simplified" science is harmful, and the idea that even scientists don't know what is happening in science, there are a few points I wanted to make (hopefully, they are self-evident):
1) First and most importantly, it is not an all-or-nothing matter. There are different levels of summarization. One does not have to have run the experiments (or even similar experiments) to make sense of the results. Certainly, the closer you've worked to a particular line of research and development, the deeper your understanding will be.
Yes, I'm aware of that. I am also aware that ego often plays a part in determining how much you "actually know" and how much you "think you know". My attitude is that it's a lot better to assume that you know absolutely nothing than to think that the simplified science is all that there is.
2) Second, there is a vast difference between an accurate "simplification" (summarization) and misinformation.
We will have to disagree here, because accuracy is a value judgment and so is "misinformation".
3) The general public ought to know general science, for the same reasons they ought to know how to read, and how to do basic calculations. Note, I am not saying people won't survive if not scientifically literate. But like general literacy, it will elevate the level and content of discourse when scientific literacy is nearly universal.
I disagree. There is no "ought". The same argument could be made about religion, or literature, or art.
4) There may be no "scientific canon" and no real "scientific method" to learn. But skepticism alone does not make someone scientifically literate. There are still rather well-established concepts in science (which are of course approximate) that a scientifically educated person knows. You can call it "faith" if you want, but I would prefer that the general public have faith in these approximate truths rather than resort to nonsense or magical thinking, or simply hold onto their myths while being irrationally skeptical of science.
Again, I disagree. And it comes down to what is "true understanding" and "approximate truths" and "lies", doesn't it? I've never said that skepticism makes someone scientifically literate. But I believe that self-awareness and skepticism bring people closer to truth. If you know what you believe, why you believe what you believe (in spite of the assumptions) and you know why you disbelieve what you disbelieve (in spite of the assumptions), I think you'll have a pretty good understanding of both yourself and "truth".
(which follows with the next part)
I am purposely going to pick gross approximations.
The Earth is kind of a sphere. Opposite poles of a magnet attract. F=ma. Atoms make up the matter around us. Germs can make us sick. We can control the features of animals through breeding.
These are all "general science" concepts that lay-people learn. But if they were not taught, what would people go around thinking?
The earth is flat, just look at it.
There are "magic" attraction forces, I don't believe in your poles.
Force just imparts a bit of movement; it is not proportional to acceleration. Look! *pushes object*
There is no proof that atoms exist--look at this table--it is solid.
Hand-washing does nothing for health (not even for doctors)--I mean c'mon, invisible things that float around and make us sick?
This is an exaggeration, but people held these sorts of beliefs at one point, and you will still find uneducated people who believe things like this.
I think that there is plenty of reason to be skeptical about some of the theories that get classified as "science", and that science, as it exists, is not wholly rational. I.e. skepticism w.r.t. science is not irrational. You're seeing this as a black/white picture, complete with prejudices about the "general public" (term was never defined, by the way).
I love this saying "Assuming that the orthodox is always the orthodox orthodoxy and that the unorthodox is always the orthodox unorthodoxy is a true mark of a dogmatic". I personally believe that while there undoubtedly are people who hang on to myths and are completely silly/irrational, most people are more sophisticated than that, and have other personal "reasons" (therefore not irrational) for believing what they believe.
Additionally, while science undoubtedly has brought us progress, I don't think the world would end with a few flat-earthers around (and if you look at the conspiracy theorists' "evidence" for flat earth, they are not irrational at all! If anything, they are hyperrational and skeptical of all evidence!). I guess my point is: it doesn't matter if people learn the "gross approximations" or not. It doesn't count as "true understanding", they're not working on the subject material, and it's not going to change anything in the big picture.
Imagine, in addition, if they voted against research on germs because they thought it similar to believing in fairies?
HAH. Funny that you should say that, because I have a story told to me by my 2nd-year chem lecturer. Basically, the Aussie govt provided a huge amount of funding for a research program to purify some metal (can't remember which) by coupling it electrolytically to another metal... which, as any 2nd-year chem student could tell you, is impossible because of the difference in oxidation potentials. This illustrates how the govt understood the principle in general but lacked understanding of the subtleties behind the chemistry... and gave money to an endeavor that was guaranteed to fail.
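The thermodynamic argument behind that anecdote can be sketched with a toy calculation. The rule is just E°cell = E°cathode − E°anode: if the overall cell potential is negative, the redox couple won't run on its own. (The metal pairs below are illustrative stand-ins from a standard table, since the original post doesn't name the metals involved.)

```python
# Standard reduction potentials (volts) for a few common couples.
# Illustrative values only; not the metals from the anecdote.
STD_REDUCTION_V = {
    "Cu2+/Cu": 0.34,
    "Zn2+/Zn": -0.76,
    "Al3+/Al": -1.66,
}

def cell_potential(cathode: str, anode: str) -> float:
    """E_cell = E_cathode - E_anode, at standard conditions."""
    return STD_REDUCTION_V[cathode] - STD_REDUCTION_V[anode]

def is_spontaneous(cathode: str, anode: str) -> bool:
    """A positive cell potential means the couple runs spontaneously."""
    return cell_potential(cathode, anode) > 0

# Reducing Cu2+ with zinc metal works: E_cell = 0.34 - (-0.76) = +1.10 V
print(is_spontaneous("Cu2+/Cu", "Zn2+/Zn"))   # True
# Trying to reduce Al3+ with copper cannot: E_cell = -1.66 - 0.34 = -2.00 V
print(is_spontaneous("Al3+/Al", "Cu2+/Cu"))   # False
```

The point of the story survives the simplification: the sign of one subtraction, which any 2nd-year student can do from a lookup table, already rules the scheme out.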
Additionally, Nobel laureate Kary Mullis, who (arguably) invented PCR, doesn't believe that HIV causes AIDS or that CFCs cause ozone depletion. So... orthodox orthodoxy and orthodox unorthodoxy etc. "Common sense" is not a part of rationalism.
It's good that we have a concrete example that you know well. Because I believe my reasoning is general enough to work on any example, and as long as you are honest, I think I can persuade you that it works on your own example, too.
To illustrate different levels of accurate simplification... Despite saying that chemists know nothing about proteins, I think you will have to admit that they know enough about them to be working on the project. Also, despite not being a protein scientist, I, and also many science enthusiasts who read nature, lifescientist, or other science magazines, know that quantum dots can be used to mark proteins as a means to locate or identify molecules that contain that protein. I don't know what you mean by "linking", but I have some baseline for trying to understand what you are doing. What are you doing, BTW?
I would say that the chemists don't care about the protein that we're trying to link to the QD. They just want a "proof of concept" that it can be done chemically, with reasonably high yield. Just for background, QDs are a mix of heavy-metal nanoparticles that fluoresce stably and can't be photobleached. QDs can only be used to "mark" proteins fluorescently if they have been "linked" covalently to an antibody that can recognise the protein of interest. Otherwise, QDs are a fucking bitch (excuse the language) to work with, unstable in even the most common solutions, and don't do anything beyond fluorescing. I've also shown that the fluorescence can definitely be quenched. That, of course, wouldn't be documented in any magazine (hence the lies-to-children in popular science publications) because it's not in our interests to actually say such things.
I have been working on trying to form a specific covalent bond between QDs and an amyloid-forming protein via a maleimide cross-linker. I have shown over the last few months that the QDs aren't stable in the solution environment that the proteins like, the proteins don't cooperate in the solutions that the QDs are stable in, and that when I've finally found a happy medium, the bonds are not specific and the protein basically binds everywhere on the QD, and washes off the organic molecules that keep it stable in solution.
My biochemist supervisor says that I'm concentrating too much on the chemistry part, that I should remember my focus remains on the protein. The chemist supervisor isn't interested in anything else beyond making the QDs stable and colourful, proteins are secondary. The nanoparticle supervisor wants me to embark on a whole different project with the QDs and proteins that would take years. It is annoying and counter-productive because there is no central vision for the project.
I find your blanket statement that "people don't know about other fields" to be false. The people working together have to know enough about what the other is doing to interface with each other. In arenas of scientific collaboration, the baseline knowledge of someone scientifically trained (especially in a similar field) will be vastly superior to a layperson's. Do you believe this to be false in your own team? If it is false, has it been detrimental?
Also, who is going to read your publications? Your own team? What is the point of that? Don't you have people who are interested in your results? Who is funding you? What "field" does your money source have to be from to "understand" your results?
Ref. earlier explanation about the situation. They do know about what is happening at the interface, but tend to think that certain problems specific to the situation could be solved using typical blanket solutions (that you can't use because of controls in the other field). So that is what I meant. It is very detrimental, definitely. It also means that the project doesn't move forward.
Yes, the baseline knowledge of professionals is definitely higher than a layperson's. But I think that it gets in the way of progress, because at least the layperson would admit that they don't know anything. Many professionals think that because they know something and it's worked for them, it would work everywhere else. I guess it comes back to "better to think that you know nothing and be open-minded than to think that you know everything and be close-minded".
Honestly, I don't know who would care enough to read my paper (if a paper does eventually come of this). I personally don't see the significance of it, and many of my biochemist colleagues agree that it would have zero physiological relevance, so byebye funding. It's proof-of-concept and nothing more. I'm only doing it because I'm being paid for it, as a side-project to my other projects that definitely have a LOT more clinical relevance. And my supervisor is only paying me for it because he's a cheapskate (this is funded by an inter-departmental grant) and wants to keep me on to work on my other projects.
Wow. This is a lot longer than I thought it would be.