Low-quality evidence renders statistics meaningless
John Ioannidis, noted canary in the coal mine of bad science, just put an article on ResearchGate that caught my attention. On November 30, 2016, he posted his latest co-authored article, called “Diet, body size, physical activity, and the risk of prostate cancer.” Here’s the abstract, and here’s the full article. It reviews the meta-analytic evidence regarding the risk factors for prostate cancer.
The findings summarized by the abstract? Of 248 meta-analyses, 176 used continuous exposure assessment to measure the impact of each factor. Of those 176, none satisfied all of the authors’ pre-specified criteria for the meta-analytic methods needed to provide strong evidence of a factor’s link to prostate cancer. Not one.
The authors graded the strength of evidence in these meta-analyses into four categories: strong, highly suggestive, suggestive, and weak. The risk factor with the most convincing evidence for developing prostate cancer? Height.
For every 5 additional centimeters in height, the risk of developing prostate cancer increases by 4%.
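To make that concrete, here’s a back-of-the-envelope sketch. It assumes the 4% bump per 5 cm compounds multiplicatively, which is the usual reading of a dose-response relative risk (my assumption, not spelled out in the abstract):

```python
# Back-of-the-envelope: relative risk of prostate cancer by height,
# assuming the 4% increase per 5 cm compounds multiplicatively
# (my assumption about how to read the dose-response estimate).

RR_PER_5_CM = 1.04  # 4% increase per 5 cm of height

def relative_risk(extra_height_cm):
    """Relative risk vs. a reference height, for a man extra_height_cm taller."""
    return RR_PER_5_CM ** (extra_height_cm / 5)

for extra in (5, 10, 15, 20):
    print(f"+{extra} cm -> RR = {relative_risk(extra):.3f}")
# +5 cm  -> RR = 1.040
# +10 cm -> RR = 1.082
# +15 cm -> RR = 1.125
# +20 cm -> RR = 1.170
```

So a man 15 cm taller than the reference carries roughly 12.5% higher relative risk. Note that this is relative risk, not absolute risk, which is exactly the gap that clickbait headlines exploit.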
Quick, somebody, feature the headline:
Does Being Tall Give You Cancer? Shocking New Research Shows That Taller Men Are More Likely to Develop Prostate Cancer
How are my clickbait-headline-writing skills?
...Okay, the scientist in me demands that I present a more serious and evenhanded treatment of the topic. So, I’ll report that there is also some evidence suggesting that BMI, weight, dietary calcium, and alcohol intake appear to influence prostate cancer development.
However, the authors did emphasize in the abstract that “...only the association of height with total prostate cancer incidence and mortality presented highly suggestive evidence...” The other factors I listed above are “supported by suggestive evidence.”
But, considering the reflections on the state of biomedical science that Ioannidis published in February of 2016, one wonders just how “suggestive” that evidence really is!
I think this is a good applied example of why an understanding of statistics is important in today’s world! It’s also a good example of how easily the competition for funding can corrupt proper scientific procedure! But, lest you think I’m trying to pick on biomedical research, here’s another example that hits frighteningly close to home.
My takeaway: no matter how advanced your statistical techniques or how powerful your software, statistics are meaningless when the evidence itself is biased...
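The point is easy to demonstrate with a toy simulation (my own illustration, not from the paper): if the evidence-gathering process is systematically biased, collecting more data just buys you a tighter confidence interval around the wrong answer.

```python
# Illustration (mine, not from the paper): systematic bias doesn't
# average out. More data shrinks the standard error around the
# *biased* estimate; it never moves it back toward the truth.
import random
import statistics

random.seed(42)

TRUE_MEAN = 0.0   # the real effect
BIAS = 0.5        # systematic error baked into how evidence is gathered

for n in (100, 10_000, 1_000_000):
    sample = [random.gauss(TRUE_MEAN + BIAS, 1.0) for _ in range(n)]
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    print(f"n={n:>9,}: estimate={mean:+.3f} +/- {1.96 * se:.3f} "
          f"(truth is {TRUE_MEAN:+.3f})")
# The interval collapses around +0.5, not 0.0: precision without accuracy.
```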