(To listen to this podcast segment, go to Skeptical Connections.)
I’ve been thinking a lot about the role of rank-and-file skeptics: us regular, non-professional skeptical schlubs. How do we spend our skeptical time?
Here’s a question: If we generalize that skepticism was refined in the late 20th century by Sagan, Randi, Nickell, et al., what has changed since then? (Don’t just say ‘the internet.’)
Of the no doubt many answers to that question, I’m going to focus on just one, which I’ll generalize as the psychology of belief. In the last couple of decades we’ve seen tons of research into why and how people believe what they believe. And how people change their minds. And how people, well, kinda rarely actually change their minds.
Without doing a dull layman’s recap of all this research, let’s stick to the backfire effect. Essentially, the backfire effect is this: when we believe something, being confronted with evidence that contradicts that belief actually strengthens it rather than weakening it.
Understandably, there’s been a fair amount of discussion among skeptics as this and similar findings have appeared, developed, been replicated, and, as far as I can tell, become consensus.
So should this change our approach? I’m not going to pretend I have the answer, but I think it does give credibility to something a lot of us do in our day-to-day lives as skeptics: stepping on flames to try to stop the larger fires.
Posting a Snopes link, pointing out opposing arguments, even just saying “hmm, I’m not so sure about that.” These little things can barely be called activism, but my guess is they do slow down wacky ideas and keep them from becoming entrenched too quickly. If a new woo-meme’s appearance is accompanied by some good old ‘Um, uh, yeah, that sounds fishy…’ maybe we can keep it from fossilizing to the point where the backfire effect and its friends reign supreme.
What’s new today is woo tomorrow. Likewise, today’s incomplete science or downright bad science is tomorrow’s pseudoscience. So we fight back against poor reporting, poor press releases, conclusion-jumping, and unreplicated results. While we can’t always stem the tide of false belief, we can slow its progress and get people to think twice before incorporating preliminary findings into their worldview.
Which brings me back to traditional skepticism and new skepticism. The traditional toolkit is great, but the psychological evidence that’s accumulated suggests that doing what we can to stop false ideas from taking root in the first place is important work too. Just being the doubting voice in a conversation has value.
When it comes to the cultural incorporation of new beliefs, one of the most important messages for rank-and-file skeptics to spread is a simple one, a classic: slow down.
A great article on the backfire effect and related biases: http://youarenotsosmart.com/2011/06/10/the-backfire-effect/