What’s up with the hating on science lately? I’ve been noticing a number of people talking about science with disdain, and it’s really getting under my skin. It’s as if subscribing to unfounded treatments were somehow superior, and wanting scientific proof of a treatment’s efficacy were a lack of faith. When did healthcare become a religion? Why would someone say, “Oh, dietitians follow science,” in a tone that implies we’re a bunch of unenlightened atheists while they follow the true word of their holistic god?

I’m open to new developments. If you can prove to me that there is some benefit (beyond a placebo, although admittedly placebos can be pretty powerful) to consuming whatever extract or supplement you’re extolling the virtues of, then I’ll gladly change my tune. But is it really so wrong of me to want proof? Why should I blindly throw my money and support behind unproven remedies? And why can’t this dialogue go both ways? I watched those Food Matters documentaries; I want to hear all sides of a story.

It baffles and frustrates me that so many people buy into this sort of thing, not only without bothering to check the validity of the claims being made, but while wantonly ignoring evidence that goes against their viewpoints. I’m sorry, but there is nothing virtuous about putting blind faith in unproven remedies and spurning science.