Thursday, December 1, 2016

When statistics are meaningless



Low-quality evidence renders statistics meaningless

John Ioannidis, noted canary in the coal mine of bad science, just put an article on ResearchGate that caught my attention. Late on November 30, 2016, he posted the latest article on which he is a co-author: “Diet, body size, physical activity, and the risk of prostate cancer.” Here’s the abstract, and here’s the full article. It reviews the meta-analytic evidence regarding the risk factors for prostate cancer.

The findings summarized by the abstract? 176 out of 248 meta-analyses used continuous exposure assessment to measure the impact of each factor. Of those 176, none satisfied all of the authors’ pre-set criteria for using the best meta-analytic methods to provide strong evidence of the factors linked to prostate cancer. Not one.

The authors graded the strength of evidence in these meta-analyses according to the following categories: strong, highly suggestive, suggestive, and weak. The most reliably supported risk factor for developing prostate cancer? Height.

For every 5 additional centimeters in height, the risk of developing prostate cancer increases by 4%.
  • Quick, somebody, feature the headline:
    Does Being Tall Give You Cancer? Shocking New Research Shows That Taller Men Are More Likely to Develop Prostate Cancer
How are my clickbait-headline-writing skills?
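Taking the paper's headline number at face value, the arithmetic compounds. Here's a back-of-the-envelope sketch in Python (my own illustration, not the authors' model; I'm assuming the 4%-per-5-cm estimate applies multiplicatively across the height difference):

```python
def relative_risk(extra_cm: float, rr_per_5cm: float = 1.04) -> float:
    """Relative risk multiplier for a man `extra_cm` taller than the reference.

    Assumes the reported 4% increase per 5 cm compounds multiplicatively,
    which is how continuous-exposure estimates are typically modeled.
    This is a hypothetical illustration, not the paper's actual model.
    """
    return rr_per_5cm ** (extra_cm / 5)

# A man 15 cm taller than the reference height:
print(round(relative_risk(15), 3))  # ≈ 1.125, i.e. roughly 12.5% higher relative risk
```

Even for a quite tall man, the bump in *relative* risk is modest, which is part of why the clickbait framing above is so misleading.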

...Okay, the scientist in me demands that I present a more serious and evenhanded treatment of the topic. So, I’ll report that there is also some evidence suggesting that BMI, weight, dietary calcium, and alcohol intake are factors that appear to influence prostate cancer development.

However, the authors did emphasize in the abstract that “...only the association of height with total prostate cancer incidence and mortality presented highly suggestive evidence...” The other factors I listed above are “supported by suggestive evidence.”

But, considering the reflections on the state of biomedical science that Ioannidis published in February of 2016, one wonders just how “suggestive” that evidence really is!

I think this represents a good applied example of why an understanding of stats is important in today’s world! I think it’s also a good example of how easily the competition for funding can corrupt proper scientific procedures! But, lest you think I’m trying to pick on biomedical research, here’s another example that hits frighteningly close to home.

My takeaway: No matter how advanced your statistical techniques or how powerful your software, statistics are meaningless when the evidence itself is biased...

Wednesday, November 30, 2016

A practical guide to Psych Stats



I've previously found the document "Reporting Statistics in Psychology" highly useful, so I made a presentation for a stats course that I think is worth sharing! My own guide, a supplement of sorts, covers a slightly broader range of topics than the document above, and it also offers a 'bottom-line' approach for the people who just want to know what they should do!


Mine is called "A practical guide to Psych Stats," and I've made a freely available, freely downloadable PDF of that presentation here.
https://drive.google.com/open?id=0B4ZtXTwxIPrjUzJ2a0FXbHVxaXc


This is probably going to be useful to you if any of the following are true:

Early-career/inexperienced students:
  • You've been unsure which test is appropriate for a certain dataset
  • You've struggled to understand psych stats from a conceptual perspective
  • You've struggled to write up statistical results in APA style
  • You've wished there was an easier-to-use stats program
  • You've wished there was a free stats program that you can run on your own computer
More experienced/advanced students:
  • You've thought that null hypothesis significance testing (NHST) procedures didn't make sense
  • You think that the APA's reporting standards for statistical tests aren't stringent enough
  • You're not sure how to interpret standardized measures of effect size
  • You want to know a little bit more about Bayesian statistics
  • You're not sure how to interpret your Bayesian statistics
  • You're looking for a free/better/more user-friendly/more widely-compatible stats program to run on your own computer
Instructors:
  • You're looking for a quick, easy, free, relatively brief resource to guide your students through the morass that is psych stats
    • Bonus: links are embedded! :D
      However, for best effect, you must download the PDF, as the online preview version may randomly insert characters that will break the links :(
Enjoy, and I hope you find this helpful!

Wednesday, October 26, 2016

Under Construction

Welcome to Fearless Psychological Science, a professional blog with my thoughts about scientific methodology, statistical analysis, decision research, and other psychology topics that are near and dear to my heart.

The recent conflagrations over replication in psychology and the "methodological terrorism" piece, along with the longstanding debate over whether psychology is a "real science," prompted the creation of this blog.

The URL and title of this blog should make my position on the last topic quite clear. I hope to show that I'm right, and, more importantly, why I'm right.

Stay tuned as I dive fearlessly into these sticky topics. I intend to drop some truth bombs, and I hope you join me! And, since I'm fearless, I hope you correct any mistakes I make with equal fearlessness! We're all here to share, learn, and improve science together--and that means calling me out when I'm wrong :)

In the meantime, you can read my scholarly work at the official journal of the Society for Judgment and Decision Making here, and my mass-media work for Psych Central here.

*Note: Please be aware that I made a mistake in the Psych Central piece; it was actually Dana Carney--not Amy Cuddy--who stated that she no longer believes in 'power pose' effects. Thanks to the commenters, 'Helpful Guy' and 'H Douglas,' for correcting my mistake so quickly!*

For the 'rules of the road' and information on cookies/privacy, please see my 'About' page.

Also, you can follow me on Google Plus or Twitter for updates!

Edit to add (11/3): A detailed and insightful piece on the "methodological terrorism" debate can be found here, with some solid points for both sides of the discussion. For what it's worth, you can compare the leaked draft version of Dr. Fiske's article (openly accessible copy here) to the published version here. As far as I can tell, some of the words themselves have been changed, but Dr. Fiske's overarching argument has not.

A countervailing view, along with a reasonably thorough history of the recent 'replication crisis' in psychology, can be found in a blog post by statistician Andrew Gelman. Tal Yarkoni also provides a well-thought-out rejoinder.
