SQUIDOCTO

Hello Squidocto: New Beliefs? Slow Down.

(To listen to this podcast segment, go to Skeptical Connections.)

I’ve been thinking a lot about the role of the rank-and-file skeptics, us regular, non-pro skeptical schlubs. How do we spend our skeptical time?

Here’s a question: If we make a generalization that skepticism was refined in the late 20th century by Sagan, Randi, Nickell, et al., what has changed since then? (Don’t just say ‘the internet.’)

Of the no doubt many answers to that question, I’m going to focus on just one, which I’ll generalize as the psychology of belief. In the last couple of decades we’ve seen tons of research into why and how people believe what they believe. And how people change their minds. And how people, well, kinda rarely actually change their minds.

Without doing a dull layman’s recap of all this research, let’s stick to the backfire effect. Essentially, the backfire effect is this: when we believe something, being confronted with evidence that contradicts that belief actually strengthens it rather than weakening it. [1]

Understandably there’s been a fair amount of discussion among skeptics as this and similar findings have appeared, developed, been replicated, and as far as I can tell, become consensus.

So should this change our approach? I’m not going to pretend I have the answer, but I think it does give credibility to something a lot of us do in our day to day lives as skeptics: stepping on flames to try and stop the larger fires.

Posting a Snopes link, pointing out opposing arguments, even just saying “hmm, I’m not so sure about that.” These little things can barely be called activism, but my guess is they do slow down wacky ideas, keep them from becoming quickly entrenched. If a new woo-meme’s appearance is accompanied by some good old ‘Um, Uh, Yeah that sounds fishy…’ maybe we can keep it from fossilizing to the point where the backfire effect and its friends reign supreme.

What’s new today is woo tomorrow. Likewise, today’s incomplete science or downright bad science is tomorrow’s pseudoscience. Fighting back against poor reporting, poor press releases, conclusion-jumping, unreplicated results… While we can’t always stem the tide of false belief, we can slow down its progress, get people to think twice before incorporating preliminary findings into their worldview.

Which brings me back to traditional skepticism and new skepticism. The traditional tool kit is great, but the psychological evidence that’s accumulated suggests that doing what we can to stop false ideas from taking root in the first place is important work too. Just being the doubting voice in a conversation has value.

When it comes to the cultural incorporation of new beliefs, one of the most important messages for the rank-and-file skeptics to spread is a simple one, a classic: slow down.

——————————

[1] Great article on the backfire effect and related biases. http://youarenotsosmart.com/2011/06/10/the-backfire-effect/

Hello Squidocto: Straw Skeet Shooting

(To listen to this segment, go to Skeptical Connections.)

My partner in crime and I came up with a useful new term the other day: straw skeets. Or maybe straw skeet shooting.

Either way. As you’ve probably guessed, it’s a handy combination of the straw man fallacy and skeet shooting. (I don’t know anything about real skeet shooting, so I’m talking about the cartoon version, where clay discs are tossed in the air one by one, and a shooter takes aim at each and fires.)

You may be familiar with the Gish gallop, the term coined by Eugenie Scott of the National Center for Science Education to describe the debate technique of the late creationist Duane Gish. He would barrage his opponent with statement after statement: false facts, straw men, misdirection, one after the other, so there was no way anyone could respond to them all. Also called “spreading,” this can leave the impression that the Gish galloper has a mountain of evidence, so even if a few points are addressed by the opponent there’s not even a dent in the pile.

So straw skeet shooting is trying to deal with this. Statement, bang that’s a lie; statement, bang that’s out of context; statement, bang that’s a red herring; statement… even this caricature of it is exhausting.

Usage examples:

"I’ve heard her debate before, so I’m prepared to be doing a lot of straw skeet shooting."

"Shooting straw skeets is a waste of time. No more debates for me."

"Wow, you’ve got a whole sack of straw skeets there, don’t you."

"Nah, it’ll be great. I’ll Gish gallop, you straw skeet shoot, then we’ll go down the slippery slope and lunch on red herrings by the poisoned well."

Straw skeet shooting. You should adopt the term! Here’s why: it’s memorable, it’s meaningful, it fills a vast void in our language, it’s the most clever term ever, all the kids are saying it, it tastes like armadillo flesh, it smells like snoring…

Hello Squidocto: The Shrug Off

(To listen to this segment, go to Skeptical Connections.)

A while back, in an online argument, my opponent wrote that the person who had recently set fire to a papier-mâché animal during a protest march invalidated the protest—that it proved the political claims of the marchers couldn’t be taken seriously.

I responded: really? If, in a peaceful group of twenty thousand people, one person does something stupid, the whole group’s arguments can be ignored? Their response: yup.

That’s one version of something we all see every day: the shrug-off.

You know the shrug-off. It’s when someone says, directly or indirectly, “I don’t have to take them seriously because…”, and here insert any number of things: I don’t have to take them seriously because they got that fact wrong, their grammar and spelling are terrible, they used a logical fallacy, they’re in a cult, they believe in god, this website has been wrong before, they smoke pot, they…

There are so many. I could go on…

It’s kind of a combination of an ad hominem and denying the antecedent, but generalized; it’s more of an attitude than an argument. The version that skeptics are, unfortunately, in danger of falling into is the argument from fallacy, or the fallacy fallacy.

So you hear someone say “water is a juice, and you can mix juice with whiskey; therefore water can be mixed with whiskey.” And you jump in and you’re like, nuh-uh, you shark turd, water isn’t a juice! The argument is fallacious, so clearly water can NOT be mixed with whiskey.

And you not only look like a jerk, you are in fact a fallacious jerk.

You know, it’s a little difficult to come up with a good example, because a fallacy, out of context and highlighted, is easily recognized. The shrug-off happens quickly, in the heat of discussion, or when scrolling through piles of information. The filters and shortcuts you need to stay focused can easily devolve into the shrug-off.

And of course, even just using words that have slightly different meanings to different people can make it seem there’s a fallacious argument, when perhaps it’s just a misunderstanding.

Shrug offs are going to happen. People are going to shrug off what you’re saying without actually wrapping their head around what you’re saying. And yes, you are going to shrug off what someone else says for reasons that have nothing to do with the point they’re trying to make. We’re all bombarded by people saying things. And I don’t care how skilled you are in the skeptical arts, it’s gonna happen.

So, what to do for those who aspire to intellectual honesty? Perhaps an alternative is: shrug-off, shoot the moon. This would be where you treat every statement, by everyone, as fallacious. Of course if you really did that, consciously, you’d be miserable and everyone would hate you. So instead, be skeptical. And this is one reason we’re skeptics: everything anyone says, regardless of their status, low or high, their fame, their intelligence or lack thereof, their menial job, their Nobel Prizes, every statement might be wrong.

This skeptical stance seems negative to some, not to me (obviously). But one good thing it does is slow you down; it makes you remember that intellectual honesty is really hard.


Snowflake / Snowstorm
———————————————————

(Studies referenced in this episode.)

A series of studies was recently published in the journal Psychological Science. The summary, which I admit I’m lifting directly from the press release, is:

"Liberals tend to underestimate the amount of actual agreement among those who share their ideology, while conservatives tend to overestimate intra-group agreement"

Briefly, the researchers conducted several surveys that asked groups of people a combination of questions to establish both political leanings and neutral preferences (such as coffee preference). Participants were also asked to what degree they felt their peers agreed with them on their answers to the various questions. And basically, liberals perceive themselves as unique, even when they’re not, and conservatives perceive themselves as the norm, even when they’re not.

Don’t worry. I’m not going to talk politics. I’m not even saying this study is right— as far as I can tell it hasn’t been replicated yet, though the results certainly make intuitive sense.

Personally I’m a fan of skepticism’s attempts to be non-partisan. While it’s important to remember that any empirical political claim is fair game, the fact that I regularly meet fellow free thinkers who vote completely differently from me is such a relief. And the environment we often create that welcomes disagreement is one of our traditions I hope will trickle out to everyone else.

But whenever I read about studies like this, I always wish the questions had included a skeptical option: a box you could check that says “I haven’t seen enough relevant data to answer this question with sufficient confidence.”

And this got me wondering: how do skeptics know what they agree on? You certainly see people celebrating, and bemoaning, topics of apparent skeptical consensus. So when studies such as these show, as usual, how many different ways there are to be wrong about what you think you know, perhaps they should serve as a reminder to free thinkers: whatever consensus you think exists, or doesn’t exist, in our community… you might be wrong.

I would love to see a study done on us, we who love to think about how we know what we know. For the rational community, I’d be curious to learn: how do we know what we think we know about what we know?

Here’s a haiku:

Plum falls from a tree
And rolls on dew-glistened grass
The skeptic wants proof

New Album: Day Trips

I have a new album!
Day Trips.
It’s a collection of my audio collages.
They’re field-recording-mash-up-portraits I’ve made over the years.

(For those accustomed to my chamber music or songs, this is something different: they’re more like sound sculpture than music. Actually, a friend called them “Almostmusic” and I think I’m going to run with that term.)

You can read some more about it at my website, where there are also links to iTunes and CDbaby and such. (This is a download-only release; it should be available from any of the familiar online music services.)

Yay!

http://www.matthewschickele.com/album-day-trips.html


Explanation generation

—Discussed in this segment: Political Extremism Is Supported by an Illusion of Understanding (Coverage, original study)
—From the podcast Skeptical Connections.

Let’s start with a quote from journalist Tania Lombrozo: “The striking implication, for which the researchers find support, is that getting people to appreciate their own ignorance can be enough to rein in strong opinions.”

It’s been widely covered in skeptical circles: the evidence has been piling up that many people, maybe even most, when confronted with evidence that challenges their beliefs, actually end up feeling more strongly that they are right. It’s frustrating, to say the least. You can be diligent, do your homework, arm yourself with citations and papers, and ironically, the more evidence you have, the less likely you are to convince them.

Now, in the spring of 2013 a paper was published titled “Political Extremism Is Supported by an Illusion of Understanding,” by Fernbach, Rogers, Fox, and Sloman. The basic idea is that people with extreme political views tend to be imagining a caricature of something that is actually complex. (This held for people on both the right and the left.) As the authors say, “polarized attitudes are enabled by simplistic causal models.”

I doubt this is news to you. But their experiments have also provided data on what worked to *moderate* those extreme views. And the winner is: asking the person to, in detail, explain the issue, and to lay out, step by step, how the policy in question would actually work. When made to slowly walk through the causal chain they are imagining, people tended to realize they weren’t quite as sure as they thought.

This study got some decent attention, but I think all skeptics should be aware of it. After all, it provides some real (if preliminary) data that is relevant to what we have been calling the ‘tone wars.’

The authors called their technique ‘Explanation generation.’ To me it also smells a bit like the Socratic method.

If I can be so bold as to speculate a little, this study — which is of course far more detailed than I’ve represented here (we’ll link to it, and you should check it out) — this study is evidence that supports listening, and asking clarifying questions, as opposed to making statements of fact or arguing over details. Keep it cool. Listen. Discuss.


(from episode 21 of the Skeptical Connections podcast.)

Hello Squidocto: Buh-Bye, Uncaring Universe

As a skeptic and naturalist, I kind of revel in my comfort with the fact that we live in an uncaring universe. So many people are so freaked out by the idea, dislike even considering it, that I’m happy it doesn’t bother me in any way. At least there’s one anxiety I don’t wrestle with.

I know I don’t need to convince you guys, but just for fun, let’s make a comparison. I live in Queens, so I’ll replace “universe” with “island.” (Yes, Queens and Brooklyn are both, physically, part of Long Island.)

If one of my neighbors said to me, “But if the island doesn’t care about us, if all this dirt and rock has no feeling, then our lives are meaningless,” I’d just… blink. I wouldn’t know what to say. That would be an utterly absurd thing to believe. Millions of people believe the equivalent, I know, but it’s just weird.

Nevertheless, I want to let you know why I’m saying buh-bye to the uncaring universe. I mean the phrase, the phrase “uncaring universe.” 

Allow me to quote Matt the Poet from the JREF forum a few years back. He sums up nicely why I’m undertaking this linguistic endeavor:

"By asserting that the universe ‘doesn’t care’ about you, you’re admitting to the idea that it might, and thus already tacitly anthropomorphizing it." —Matt the Poet

Matt’s right, and I’m hereby pledging to abandon the phrases “uncaring universe,” “unfeeling universe,” or any related anthropomuffinizations of the cosmos.

It’s not a big thing, of course, just a form of false dichotomy, really. But it does represent one of the greatest divides between freethinkers and everyone else. Many of the beliefs of theists, new agers and fuzzy thinkers can be boiled down to imbuing the universe with agency, and if I disagree and say that, in fact, I think the universe doesn’t care, I’m putting myself in opposition to them. But I’m not in opposition. It’s not that I think the universe has no feelings, it’s that I think it doesn’t even qualify as a topic of discussion: it’s absurd.

Put another way, what if my neighbor insists that 2+2=fish?

Once again, I just blink. It’s not that I think 2+2 doesn’t equal fish. It’s that, as far as I can tell, 2+2=fish doesn’t mean anything in the first place. (Of course, my neighbor also pities me and worries that, without believing that 2+2=fish, I am immoral and eternally doomed, which is just… what? How does that follow?)

The reputation of freethinkers is plenty bad already. It doesn’t need someone like me, using lazy language that implies a belief I don’t hold, giving bales of hay to those who would build a straw man of us.

The universe is… neither caring nor uncaring. The universe… is.

Storm King

Audio Collage Portrait of the upstate NY art center.

More details here.

The Wolfman vs. My Mandel-bro

From the Skeptical Connections podcast.

Approximate transcription:

Imagine some cheesy movie music, then on the screen, in massive letters, appears: 

The Wolfman vs. my Mandel-bro

I watched two TED talks the other day, and boy, talk about contrast. I mean contrast in demeanor.

The two talks were by Stephen Wolfram and Benoit Mandelbrot. The latter is now deceased. Both are highly accomplished, brilliant people who have done work that, frankly, I am not qualified to comment on. (At least publicly.)

But I was so struck by the contrast of the two personalities. First Wolfram. So, here I could either come up with a long list of adjectives using words like arrogant, or I could just say that I really really had the urge to slap him. Then Mandelbrot, what a cutie, I just wanted to curl up next to him and share a nice algorithmic giggle. He’s like a grandfather whose details, when you zoom in on them, are still grandfatherly.

And I found myself less inclined to trust anything Wolfram was saying. As I was taking in his ideas, my brain was automatically tossing in many hmmmms and unlikelys and yeah-rights. Meanwhile Mandelbrot elicited many more awesomes and fascinatings. 

But I can’t check their work, either of them. I can follow the general concepts, but it’s up to someone else to declare thumbs-up or bullshit.

Alas, then I realized, here’s a bias in action. Someone rubs me the wrong way, and automatically my brain turns against them. I’m being influenced by a personality trait that has nothing to do with the quality of their work. That’s bias. 

Often, even if you’re aware of it, your brain still pops in those declarations of “I know what’s going on, I know the dealio.” It’s always ready with an opinion.

When it comes to life-and-death situations, I think you’d agree that watching a TED talk while eating noodles does not qualify. But what if I was in a truly dangerous spot with the two of them? Just me, the Wolfman and my Mandel-bro. We have only minutes to make the decision that will seal our fates. Would I spend those precious minutes weighing their arguments rationally, or would I waste them fuming in exasperation at the Wolfman’s insistence that not only was his plan the best, but that he had in fact invented the concept of planning in a paper he published when he was only 14?

I guess we’ll never know. And don’t get me wrong, I feel perfectly comfortable being annoyed by the Wolfman, even if I can’t comment on his mathematica or anything. It’s neither here nor there. At least he’s a reminder of the importance of good communication skills, and a reminder to appreciate people who are good at explaining ideas without condescension. And he’s also a reminder that bias can rear its ugly head at any time.

————————-

TED talks:

Stephen Wolfram

Benoit Mandelbrot

The Invisible Therefore

Approximate text:

I feel like there’s got to be a name for this, but I can’t find a name for this. So if there’s a name for this let me know. I call it “the invisible therefore”.

It’s the unspoken implications during a conversation. Consciously or not, it’s what is both silently implied by the speaker and silently understood by the listener. It’s a messy part of our conversational habits. The invisible therefore is when, in your head, you follow a statement with “therefore x.”

So, if you’re speaking, you might say “I don’t like that singer,” which is a simple statement of musical taste. Subjective.

But in the real-time flow of conversation, both you and the listener are likely to mentally fill in the blank, without saying anything out loud.

So you say “I don’t like that singer,” but perhaps you’re also thinking “that singer is not just terrible, they represent what’s wrong with the entire music industry,” or perhaps “their music isn’t even music, it’s just a corporate carrot on a stick,” or perhaps you’re not thinking either.

But also, if the person you’re talking to actually likes that singer, and hears you say “I don’t like that singer,” perhaps they think your invisible therefore is “therefore you have terrible taste in music.”

Ouch. You didn’t say that. But it’s what they heard.

And yes, I hear you. I hear you thinking “Squidocto, where’s your evidence? Are you riffing on a study, or…?” No, I’m not, I’m speculating. But allow me to explain why I think the invisible therefore is not only prevalent and the cause of many headaches, but also a useful idea.

Instead of music, pick any more serious subject. Religion, politics, whatever. In such discussions people famously respond not to what was actually said, but to what they think was actually meant… and we have names for this: straw man, slippery slope, any of those logical fallacies. I like to group them as ‘the invisible therefore’ because the term links them back to the speaker.

Think about it, the speaker probably *does* have a ‘therefore’ in mind. When talking about a serious topic, I bet you do tend to have an opinion on what the next step should be, what “therefore” should happen. But you almost never say it, because it’s a whole different kettle of fish, it’s a different conversation entirely.

Unlike a logical fallacy, which is always committed by someone, by one participant, the invisible therefore is a shared confusion. 

The speaker makes a point, but stops short of spelling out what they think the implications should be, because that would just complicate the point, and likely derail the argument. And the listener, not wanting to be caught or argued into a corner, quickly runs through the implications of the point made, and, to be safe, assumes that the worst implication is what the speaker has in mind — but, of course, assumes it silently.

I know none of this is new or anything, but I think it’s important to keep it in mind. Cuz you know, straw men and slippery slopes aren’t necessarily the tools of a weak mind, they are defense tactics — if someone gives you an inch, are you going to take a mile? They want to know.

So if you’re making a point, consider making your ‘invisible therefore’ visible. Or at minimum, take a moment to notice whether you have one. If you don’t, that’s important information too. 

For example, I think some of my neighbors are so strictly religious that it’s reasonable to say that their children are oppressed, denied liberty and knowledge that is rightfully theirs. 

Therefore… 

Well, I don’t know. I’m not sure what to do, if anything. But if I didn’t reveal my therefore, even my uncertain therefore, there are hundreds of possible therefores you could have thought I meant, and, brains being what they are, you might have picked one and concluded that, in fact, that’s what I meant.

I’m telling you, it happens all the time. We’re all assuming things about each other, about our beliefs and intentions. We couldn’t stop doing it if we wanted to. 

We speak with invisible therefores, and we hear invisible therefores. Be careful out there.

The more I learn, the more I say Hooray.
"Cento" Copyright © Andrew Brinker 2011.