Five things to remember when reading reports on scientific studies

June 4, 2012 10:29 am

Nothing's black and white

We see it over and over again: A new health study is published in a peer-reviewed medical journal, and it’s picked up by the media (thanks to the accompanying press release), summarized by reporters, and then given a provocative headline by their editors.

I recently came across a press release from the National Institutes of Health, announcing the results of a study reported in the New England Journal of Medicine that concluded that “coffee drinkers have lower risk of death.”

The researchers followed more than 400,000 men and women, aged 50 to 71 at the outset, from 1995 to 2008. They excluded people who already had cancer, heart disease, or a history of stroke. Coffee consumption was assessed — just once — at the start, via self-report. They then looked at how many people died over the 13 years that followed.

I want to use this particular study today not to discredit it, but as an example of how things can get twisted and overblown in the reporting and analysis.

1. Correlation does not equal causation

Just because people drink coffee doesn’t necessarily mean it’s the coffee that helps them live longer.  This study showed only a correlation.  In fact, the study showed that coffee drinkers overall actually had a higher risk of death:  “In age-adjusted models, the risk of death was increased among coffee drinkers.”

They then point out that coffee drinkers were more likely to smoke (why don’t we wonder if coffee caused them to smoke?) — but after they adjust for “tobacco-smoking status and other potential confounders, there was a significant inverse association between coffee consumption and mortality.”

So what’s that saying?  It means they had to do some statistical analysis to mathematically remove smoking and other probable causes of death from the equation. (In this case, they also note that the results were similar for the subgroup of people who never actually smoked — so that’s a good indicator that their adjustment is probably sound.)

Ultimately, though, the conclusion says this: “Whether this was a causal or associational finding cannot be determined from our data.”

So it does not mean that if you drink coffee, you’ll live longer because you drink coffee.  It means that people who drink coffee happen to live longer (and it doesn’t even say how much longer). It may have nothing to do with the coffee itself — it may just be the type of person who drinks coffee is more likely to live longer, for some completely other reason.
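To make the confounding point concrete, here’s a small back-of-the-envelope calculation. The numbers are entirely made up (mine, not the study’s): in this hypothetical population, coffee has zero effect on mortality, but because smokers are more likely to drink coffee, the raw numbers still make coffee drinkers look worse.

```python
# Hypothetical population: coffee has NO effect on mortality,
# but smokers are more likely to drink coffee (invented numbers).
p_smoker = 0.30            # 30% of the population smokes
p_coffee_if_smoker = 0.70
p_coffee_if_nonsmoker = 0.40
p_death_smoker = 0.25      # death risk depends ONLY on smoking
p_death_nonsmoker = 0.10

# Fraction of the population in each of the four groups
smoker_coffee = p_smoker * p_coffee_if_smoker
smoker_none = p_smoker * (1 - p_coffee_if_smoker)
nonsmoker_coffee = (1 - p_smoker) * p_coffee_if_nonsmoker
nonsmoker_none = (1 - p_smoker) * (1 - p_coffee_if_nonsmoker)

# Death rate among coffee drinkers vs. non-drinkers, ignoring smoking
death_coffee = (smoker_coffee * p_death_smoker
                + nonsmoker_coffee * p_death_nonsmoker) / (smoker_coffee + nonsmoker_coffee)
death_none = (smoker_none * p_death_smoker
              + nonsmoker_none * p_death_nonsmoker) / (smoker_none + nonsmoker_none)

print(f"Coffee drinkers:     {death_coffee:.1%} died")   # 16.4%
print(f"Non-coffee drinkers: {death_none:.1%} died")     # 12.6%
# Coffee "correlates" with death here even though, by construction,
# it has zero causal effect -- smoking is doing all the work.
```

This is the same logic (run in reverse) behind the study’s adjustment: once you compare like with like — smokers against smokers, non-smokers against non-smokers — the apparent effect of coffee in this toy example vanishes.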

2. Headlines can be just plain wrong

In today’s incredibly fast, not-so-detail-oriented news cycle, it’s the headline that gets the most attention.  The nitty-gritty details of the articles (and studies) are often missed.  Some news outlets are very careful with their words, but others tend to be a bit more alarmist — and therefore, probably more inaccurate.

Take these headlines, for example, about the coffee study:

  1. Coffee Drinkers Have Lower Risk of Overall Death, Study Shows (ABC News)
  2. Can Coffee Help You Live Longer? We Really Want To Know (NPR)
  3. Study: Coffee lowers risk of death (The Columbus Dispatch)
  4. Coffee Lowers Disease Risk: Study (The Daily Beast, which is part of Newsweek)
  5. Coffee-drinking lowers risk of death, big study finds (The Boston Globe)
  6. Coffee Shown To Reduce Risk Of Death: Study (NBC Miami)
  7. Coffee Reduces Death Risk (About.com)

I’d say that ABC’s headline (#1) is probably the most faithful to the study’s findings, since it says “Coffee Drinkers” instead of “Coffee.”  NPR (#2) dodges the problem by turning it into a question. The remaining five are completely wrong interpretations of the study.

As I mentioned above, coffee was not found to causally lower the risk of death — the study authors specifically point out there was only a correlation!  But when you read those headlines, it’s hard not to think, “Great! I’ll drink more coffee so I’m less likely to die!”

3. Reporting is an interpretation

Just as the headlines can be misleading, so can the reporting.  By definition, if you’re reading an article from a news outlet, you’re reading a reporter’s interpretation of the study.  Good reporting, of course, will be very careful to be factually correct.  But sometimes little details can shift, ever so slightly, and when it comes to studies like this, little details can make large differences — even at large, reputable news organizations.

As a simple example, a Los Angeles Times article says, “And the link was stronger in coffee drinkers who had never smoked.”  That’s sort of correct, but not technically accurate.  As I mentioned earlier, the study found that coffee drinkers who also smoked had a higher risk of death, but once the numbers were mathematically adjusted to account for the smoking, the association between coffee and death was “similar” among never-smokers (according to the abstract on the NEJM website).

My point is that once a study is interpreted by someone else, the details may shift, even just slightly, and in something where correlation and causation are already fuzzy, it can make a big difference in how we, the casual readers, interpret the findings.

4. Study authors and funders are often biased

I don’t think this is necessarily the case in this particular coffee study, but it bears pointing out: It’s really, really important to know who is paying for a study, and who designed and conducted it.  Although ethical standards require disclosure of conflicts of interest, they don’t require recusal the way a judge must recuse himself or herself from a trial. The prevailing practice is that disclosure is enough — as if knowing that someone is biased is enough to eliminate the bias itself.

The reputable medical journals are, of course, generally good at disclosing these conflicts of interest.  But oftentimes those details are buried, or omitted altogether, by the time it gets to the mainstream news.

If the authors of this coffee study owned stock in Starbucks, would that change your opinion of their findings?  Probably on an intellectual level, yes.  But I bet there would still be that little voice in the back of your mind that now thinks it’s a good idea to drink more coffee.

5. Percentages are incredibly misleading

This is the biggest pitfall, in my opinion, because it is so often overlooked:  Studies typically report the amount of change as a relative percentage — a percentage of the original statistic — rather than as an absolute difference.

In this case, the study found that “relative to men and women who did not drink coffee, those who consumed three or more cups of coffee per day had approximately a 10 percent lower risk of death.”

Okay, so coffee drinkers were 10% less likely to die.  Here’s the important part: That doesn’t mean their likelihood of dying dropped by a full 10 percentage points. What it means is that it was a 10% change from the original likelihood — not a 10-point change in the overall statistic.

In the coffee study, they tracked 229,119 men.  During that time, a total of 33,731 of them died. So the overall risk of dying for a man in this group was 14.7%.

The study found a roughly 10% change in the death rates of coffee drinkers.  Ten percent of 14.7% is 1.47%.  So, a little bit of math reveals that the coffee drinkers had a 13.23% “chance” of dying — which is reported as a 10% improvement. But really, it’s just a 1.47-percentage-point improvement overall.

So all these headlines that claim “Coffee reduces death risk”?  It means that if you’re a coffee drinker aged 50-71, you have a 13.23% chance of dying over the next thirteen years instead of a 14.7% chance.
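If you want to check the arithmetic yourself, here it is in a few lines of Python, using the men’s numbers from the study as reported above:

```python
deaths = 33_731
men = 229_119

baseline_risk = deaths / men             # overall 13-year death risk
print(f"Baseline risk: {baseline_risk:.1%}")                   # 14.7%

relative_reduction = 0.10                # the "10% lower risk" headline figure
absolute_reduction = baseline_risk * relative_reduction
coffee_risk = baseline_risk - absolute_reduction

print(f"Absolute reduction: {absolute_reduction:.2%} points")  # 1.47 points
print(f"Risk for coffee drinkers: {coffee_risk:.2%}")          # 13.25%
# A "10% lower risk" headline translates to roughly 1.5 percentage
# points -- a much less dramatic way of saying the same thing.
```

(Working from the raw counts gives 13.25% rather than 13.23%; the small difference comes from rounding the baseline to 14.7% before subtracting. Same story either way.)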

(These numbers are looking at overall totals — in reality we’d need to split it up into sub-groups, like smoking-coffee-drinkers and non-smoking-coffee-drinkers, and how many cups of coffee they drank each day… but I’m simplifying a bit to make the point.)

This may be statistically significant, and it’s certainly a strong enough result to warrant further research into the potential death-defying properties of coffee, but it hardly warrants bumping your daily coffee intake to four or five cups, as the headlines would lead you to believe.

So next time you’re reading about a study, keep all this in mind, and take everything with a big grain of salt.*

* But not actual salt, since that’ll raise your blood pressure and correlate to a 15.2763% greater chance of death…

Photo by Alvin Trusty.

27 Comments on "Five things to remember when reading reports on scientific studies"
  1. Comment left on:
    June 14, 2012 at 6:14 am
    Betsy says:

    Sooooo maybe I’ll skip my yahoo homepage news from now on. ;) In all seriousness, I’m glad you wrote this because it’s so true, and it doesn’t just apply to scientific studies. I think a lot of what the media puts out is taken as black and white facts.

  2. Comment left on:
    June 14, 2012 at 12:46 pm
    Alexandra says:

    YES YES YES. Thank you. I’ve had a twinge in my proverbial back about this subject for months and I’m so glad you did something about it. I’ve found that either the study is difficult to interpret or just plain inaccessible unless I pay; even if I can see the abstract and methods, it’s usually ambiguous and doesn’t tell me the source of funding. It’s hard for the consumer. I hate to say it but I’ve just started ignoring nutritional and health studies for the most part :(

  3. Comment left on:
    June 14, 2012 at 6:28 pm
    Dennis says:

    A really good source for summaries of (mostly) medical study results is the academic reviews of papers on medpagetoday.com. Often you can find nutritionally related studies as well, all with a brief overview, and they give the publication reference and sources of funding so you can judge possible conflicts of interest for yourself.

  4. Comment left on:
    June 18, 2012 at 10:08 am
    Jackie says:

    This is so important for people to learn. When I took statistics in college, two things stuck with me and changed the way I read research. The first one was my professor saying, “The most important thing to ask whenever reading a study is: Who funded the research! Because a heart disease study funded by Philip Morris and one by The American Heart Association will yield very different results.”
    The second was when he showed a study that correlated ice cream consumption with crime. It was a way to show that the two rise during holidays and summer months and how easy it is to link two things that are clearly not related, a way to illustrate that correlation does not equal causation.

  5. Comment left on:
    July 20, 2012 at 3:21 pm

    Just stumbled upon your site. Very nicely done!

    I think you make several good points in this article. In the name of “science”, a lot of junk is published. And of course, the media is ready to jump on the first thing that comes out sounding contrary to the popular belief.

    Bad science + Bad reporting + Catchy headlines = Confused public

    It is important to check the credibility of the source and question if the study paper was peer-reviewed. Cross check with other published studies to figure out if the data analysis and conclusions make sense. Even then, remain skeptical.

    Stay hungry, stay foolish!

  6. Comment left on:
    July 21, 2012 at 2:10 pm

    [...] fact he has written a sort of layman’s guide to reading between the lines of scientific reporting.   <—  This is my favorite article on his entire site and is something I refer to often. [...]

  7. Comment left on:
    July 22, 2012 at 1:02 pm

    Great points! Wanted to let you know I linked it up for my Weekend Wanderings post.

    • Comment left on:
      July 22, 2012 at 6:54 pm
      Andrew says:

      Thanks Andrea! :)

  8. Comment left on:
    August 9, 2012 at 2:53 pm
    Kay says:

    Nice to see someone else who reads these reports and findings with a questioning mind!

  9. Comment left on:
    October 17, 2012 at 10:41 am
    Andrew says:

    Came across another perfect example of #5 today: ABC News reported on a new study that found lower rates of cancer for people who take multivitamins (vs. placebo). ABC said it’s an 8% lower risk of cancer, but that’s incredibly misleading. If you look at the actual study in JAMA, you’ll see that the net result was 18.3 incidents of cancer per 1,000 person years for placebo, versus 17 incidents with the multivitamin. A change from 18.3 to 17 is indeed 8%, but this reporting makes it sound like you’re 8% less likely to get cancer if you take multivitamin.

    In reality, the study showed that taking a multivitamin resulted in 1.3 fewer incidents per 1,000 person-years. So taking a multivitamin is really shifting the odds in your favor by 0.13% – not by 8%.

© 2010-2014 Andrew Wilder / Eating Rules — All Rights Reserved.