Why Michael Volunteers His Statistical Power


As the only statistician that anyone I know seems to know, I get a lot of questions from family and friends about research studies that they’ve seen on the news, or have heard about. I’m sometimes able to give advice on the spot, but often I like to look at the studies and see whether they make sense or not.  They often don’t.  Or at least the bottom line that is reported on the news is not what the study actually says.  AT ALL!

A couple of years ago, Ms. Britton and I were getting brunch when she turned to me:

“I know you use Listerine; you really shouldn’t.  I just read that using Listerine increases the risk of oral cancer.”

Yes, I use Listerine, and no, I’m not worried about oral cancer.  I replied: “What did the study actually say?  Because it sounds like [words that aren’t appropriate for a blog post].”

I asked her to direct me to the article that she read so I could take a look at the study it was based on.  She directed me to this article: Mouthwash linked to cancer

Which then led me to this article:

The role of alcohol in oral carcinogenesis with particular reference to alcohol-containing mouthwashes

This article is very clear in its conclusions, stating: “There is now sufficient evidence to accept the proposition that developing oral cancer is increased or contributed to by the use of alcohol-containing mouthwashes.”

Pretty clear, right?  Well, not so much. If we look at the studies that this one references, the two main ones are: Alcohol-containing mouthwashes and oral cancer. Critical analysis of literature


Oral Health and Risk of Squamous Cell Carcinoma of the Head and Neck and Esophagus: Results of Two Multicentric Case-Control Studies

These two articles were essentially the linchpins for mouthwash as a cancer risk.  Pretty damning, I mean, two peer-reviewed published journal articles.  One concludes: “With the data we have, it has been impossible to establish a causative relation between mouthwash use and the development of oral cancer.”  The other concludes: “Our mouthwash results should be interpreted with caution, as they are limited by our recording only the frequency of use.”

WAIT!!! So how does the main article conclude, fairly emphatically, that: “There is now sufficient evidence to accept the proposition that developing oral cancer is increased or contributed to by the use of alcohol-containing mouthwashes”?

It doesn’t. There might be some evidence, but even the researchers who first reported on it back away from the results quite a bit as they’re not anywhere near conclusive.

It’s like a jury convicting an alleged murderer based on an eyewitness who said: “Yeah, I saw that guy there. He could have done it. He was looking shifty. There were a lot of people standing around looking shifty, though. Maybe he didn’t have anything to do with it. I mean, it could have been anyone. OK, I actually didn’t really see anything.  Can I go home now?”

So what’s the problem here, and how does it relate to Better Bio? Well, in 5 minutes I just went from a news article, backed by a peer-reviewed scientific study, concluding that mouthwash causes cancer, to the peer-reviewed scientific study saying the same thing, to the primary research sources, which contradict those strong claims.

The problem is that no one in the chain of people who OK’d this story took the 5 minutes to go back and see what was really going on. That’s not reporting.  It’s sensationalizing.  The alternative isn’t sexy.  It’s not interesting to say, “There is a possibility that mouthwash causes cancer, but we really don’t know.”

We have a breakdown in our idea of what Science is. Science is not the study of facts. Science is the study of questions.  It’s the study of plausible explanations. When one explanation seems more reasonable than another, we accept that explanation as “True.” If we collect more evidence or data and find that our explanation is no longer the most reasonable one, then we must change what we think is “True.”

We, meaning humans, used to think that the sun revolved around the earth.  This was a very reasonable hypothesis.  If one looks outside every day at the way the sun rises and falls, it seems more sensible to say that the sun revolves around the earth than the alternative. Humans have charted the movements of the stars (and visible planets) since the beginning of time. The geocentric system was created because it fit our observations.  The problem arises when data contradict theory. And they do. So the more we looked at the sky, the less plausible geocentrism became. Eventually it became more plausible to accept the heliocentric view of the solar system.

The thing is, that doesn’t mean it’s true. As far as we can see, that is how the solar system works. If our observations started showing us information that contradicted our theory, then we would have to abandon our current theory in favor of a more plausible one!

Health science seems to be afraid of this. We saw, earlier, that a little study with a small claim became a huge headline.  The truth is that we want theories and explanations so badly that we’re too willing to make claims that the data don’t necessarily support.

We must always ask: Is there something more going on here? Is there another way of explaining these results?

I ask myself these questions whenever I read science articles because I’m conditioned to.  I’ve noticed that headlines do one of two things: scare the heck out of you, or tell you that if you do X then you can live forever. Always the extremes, never nuance.

Why Better Bio? Well, we’re inundated with health news every day, and fear sells. There is so much information, and so many claims being made, that sifting through the…the…[again, another word that’s not appropriate for a blog] is hard work.

With a little more knowledge of how to ask the simple questions, we can all do the sifting. We can all spot the snake oil salesmen. And, best of all, we can all relax a little more because we’ll be more skeptical of claims like: