By Alvaro Carrascal, MD, MPH
You may have seen some of these headlines recently in national newspapers and online:
More coffee linked to higher mortality rate: study
Four cups of coffee a day may raise early death risk in younger adults
4 Cups of Coffee a Day Can Be Deadly
New Study: Coffee Can Kill You
Under 55? Think twice before you reach for that extra cup of coffee, researchers say
After seeing these reports on the web and morning news, I had a thought as I reached for my morning cup of joe: Should I consider tea instead?
As someone who grew up drinking 2-3 cups of coffee a day, should I change my habits based on these news reports? What would happen if I don't? Should these reports end my lifelong friendship with Juan Valdez?
Then, I began to recall headlines from a few months ago.
Caffeine Linked to Lower Skin Cancer Risk
Moderate Coffee Consumption Lowers Heart Failure Risk
Moderate Coffee Consumption May Reduce Risk of Diabetes by Up to 25 Percent
What should we make of these contradictory reports? Who is right, who is wrong, and what should we believe?
Every day we are bombarded with reports about studies and occasionally these studies contradict each other. With constant activity on blogs, social media, and mobile technologies, there is an explosion of information available at our fingertips.
It should not be a surprise that the public is confused when trying to make sense of the media's ever-increasing volume of health news. Many people search the Internet looking for answers, but these are not easy to find. Many Internet resources do not provide accurate or reliable information, and what they share can misguide the public.
There are a few things we have to keep in mind when we see these reports and try to decide what to do about them.
All studies are not equal
One of the most important things to take into account is the type of study being reported. Not all studies are created equal; well-designed, well-conducted studies are more credible. In broad terms, there are 2 kinds of studies:
Observational studies: In these studies, researchers simply monitor individuals and measure certain outcomes without attempting to change them. There is no intervention or treatment. For example, researchers may do a "cohort study," where they follow a group of individuals for 20 years, taking note of risk factors, like smoking and physical activity, and looking at the impact of these factors on their long-term health. This creates different groups of people, like people who smoke or people who don't exercise, and researchers observe the differences between the groups. The recently reported coffee studies were this kind of study. Since people in a cohort can differ in many ways, the results need to be interpreted carefully. For example, some people who drink coffee may also smoke, and this would affect their chance of getting ill or dying. In short, observational studies show association, as when we say coffee is associated with a higher risk of death, but they cannot prove a cause-effect relationship, in this case, that coffee causes death.
Experimental or intervention studies: In these studies, participants receive an intervention so that researchers can evaluate its impact. It could be a medical intervention, such as a drug, or a behavioral intervention, like increasing physical activity. These studies generally involve at least 2 groups: one that receives the intervention and a "control" group that does not. Among intervention studies, the randomized clinical trial (RCT) is considered to have the most rigorous design and the most credible results. In an RCT, researchers randomly assign subjects to the experimental/intervention group or to the control group. Both groups are followed and later compared. RCTs can involve more than 2 groups, allowing different treatment regimens to be compared at once. In the field of cancer, RCTs are most commonly used to study treatment but can also be used to measure whether or not something (foods, exercise, etc.) prevents cancer.
Finally, there are two very important types of research compilations:
Systematic reviews: With these, researchers perform rigorous reviews of all relevant studies that other researchers have already done on a given treatment or intervention. The key element is that conclusions should be based on all published literature on a given topic, not just on a small, possibly biased, selection of studies.
Meta-analyses: A meta-analysis includes a statistical review of data from separate but comparable studies. Meta-analyses can include intervention and observational studies. It is generally accepted that a meta-analysis of several RCTs offers better evidence than a single trial.
News reports are imperfect
Another thing to keep in mind is that news reports are imperfect vehicles to get out information about scientific reports. I say "imperfect" for a few reasons:
1) It is difficult to translate a complex issue, such as the results of a study, into a few sentences or sound bites while still taking into consideration all the aspects involved, like the type of study, the number of people in the study, and how it was done (the methodology).
2) Even though there are specialized reporters who focus on these issues, they may not know all the nuances of medical research. Additionally, the standard editing process may sometimes remove critical information, or headlines may hype findings to increase readership.
3) An increasingly shorter news cycle requires a constant stream of information. This means reporters or editors may cover a study with very little relevance so they can fill space or airtime. It also may mean they have little time to prepare each story.
4) The media, scientific journals, pharmaceutical companies, researchers, research funders, and academic centers all have an interest in being associated with new and "big" study reports. This is a recipe for exaggeration.
A single news story, or even several, cannot provide enough information for making decisions about your health.
Evaluate like the pros
For health care professionals and researchers, the credibility of any finding depends on the type of study that produced it. RCTs, systematic reviews, and meta-analyses are at the top of the pyramid in terms of the strength of study designs; they are believed to produce the most robust conclusions.
Likewise, knowing which journal published a study is important to understanding its credibility; it says a lot about the type of review the article went through before publication. Look for respected sources, like the Journal of the American Medical Association (JAMA) or the New England Journal of Medicine. Here's a good resource from my colleague Ted Gansler, MD, to get a sense of what makes a high-quality or low-quality journal.
We should also remember that the results of any single study do not change our scientific understanding. To create "new" knowledge, many studies conducted in different settings and with diverse groups of people need to produce the same results. In other words, findings from a particular study need to be replicated by other researchers in order to be considered "fact."
With all those things in mind, and in the absence of conclusive evidence, I will continue to consider Juan Valdez my friend.
Dr. Carrascal is Vice President, Health Systems for the American Cancer Society Eastern Division.