Sci Snippet – Do iPhones kill people?

Look at this graph. It’s incredible! As iPhone sales increased from 2007 to 2010, so did the number of deaths caused by people falling down the stairs.  They even increased at the same rate.  It’s crazy – iPhones cause people to die falling down the stairs!!


For this and other hilarious graphs of silly correlations, see Spurious Correlations.

This is obviously ridiculous.  Most people can logically deduce that just because iPhone sales and deaths from falling down the stairs “correlate,” one does not “cause” the other. There are lots of other examples (and you can create your own silly correlations) at Spurious Correlations or on other news sites here or here.  And in these cases, you can laugh, realizing that just because two things happen together does not mean that one causes the other.

But even though you’ve probably said that “correlation doesn’t imply causation” or heard someone say it, in science (and science news reporting) the difference is critical to tease out. Why? Understanding whether something in science is the cause of the effect you’re seeing can

  • Prevent something harmful from causing damage. For example, knowing that smoking is a cause of lung cancer resulted in public health efforts to help people quit smoking.
  • Treat the cause of a disease or fix the cause of a problem.  For example, knowing that the H. pylori bacterium causes gastritis and ulcers provides a method for treating ulcers by killing the H. pylori. Or if you know that factory waste being dumped into a river or lake causes animal life to die or stop procreating, you can work toward stopping the dumping to save wildlife.
  • Prepare for diseases, outbreaks of disease, or natural disasters.  If you know that earthquakes cause tsunamis, then a warning system can be developed to save people in the tsunami zone.
  • Plan ahead and discuss the possible outcomes from a particular action.  When you know that lack of water in a drought causes dry forest conditions leading to forest fires, you can plan to have greater funding available for fighting these fires in a particularly dry year.

Image: “Cancer smoking lung cancer correlation from NIH” by Sakurambo, a vectorized version of the original NIH image, created in Adobe Illustrator. Licensed under Public Domain via Commons.

It’s just as important, though, to tease out when something doesn’t cause an effect – and unfortunately many false claims and much pseudoscience are based on taking correlations and touting them as causes.  So how do we figure this out?

In science, a lot of this is determined experimentally and statistically.  In statistics (in which I do not claim to be an expert, but see the links below for more details), the strength of the relationship can be calculated, and the stronger it is, the more likely that one causes the other.  The cause-and-effect relationship should also be tested experimentally, if possible, and the experiment should be repeated to see if the same results are obtained every time.  Without repeatable experimental results, the relationship is less likely to be causal.  Another useful check is the time frame – if the action takes place days, months, or years apart from the effect, you have to consider whether that separation makes sense.  In the case of smoking and lung cancer, a gap of years between cause and effect makes sense, but in other cases it may not.  Which brings up a final point: look at the relationship and think about whether it makes sense, and whether a mechanism can be found for the cause-and-effect relationship.  For example, we know that smoking causes DNA mutations and inflammation, which is one of the mechanisms that leads to lung cancer. By contrast, looking at the iPhone and dying-from-falling-down-the-stairs example, it’s difficult to find a mechanism that could explain this relationship.
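To see why “strength of the relationship” alone can’t settle causation, here’s a minimal sketch of the Pearson correlation coefficient, the usual measure of that strength. The sales and death numbers below are invented for illustration (they are not the real figures from the graph): any two series that both happen to trend upward will score near 1.

```python
# A minimal sketch: two series that both trend upward over 2007-2010
# will show a Pearson correlation near 1 even with no causal link.
# The numbers below are made up for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

iphone_sales = [1.4, 11.6, 20.7, 40.0]   # hypothetical yearly sales, millions
stair_deaths = [1307, 1420, 1520, 1660]  # hypothetical yearly death counts

print(pearson(iphone_sales, stair_deaths))  # close to 1.0
```

A correlation this strong looks impressive, but the calculation knows nothing about mechanism or timing – which is exactly why the other checks above matter.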

A great description of what to look for comes from the book club book that I’ll be talking about on Thursday, “Bad Science: Quacks, Hacks, and Big Pharma Flacks” by Ben Goldacre with a quote describing evidence-based medicine:

it needs to be a strong association, which is consistent, and specific to the thing you are studying, where the putative cause comes before the supposed effect in time; ideally there should be a biological gradient, such as a dose-response effect; it should be consistent or at least not completely at odds with what is already known (because extraordinary claims require extraordinary evidence); and it should be biologically plausible

Overall, it does come down to the data and some common sense.  If there isn’t any data to support the relationship, you might just be looking at correlation and can confidently holler “iPhones do not kill people!”

To better understand the differences between correlation and causation and the math that can show which one you are looking at, check out the Khan Academy course.  You can also read more about this topic on the Stats with Cats Blog.

For more Sci Snippets, click here.

Does the media sensationalize terrible science? Oh, yes.

The other day, I was drinking a glass of wine with a friend of mine, and she mentioned a story she had recently read in Scientific American called “Changing Our DNA Through Mind Control.”  She was excited to tell me how scientists had found that decreasing your stress can actually change your DNA! This was fascinating!  She was excited!  I was excited!  But being the scientist that I am, I wanted to understand the what, how, and why, so I needed more information. To gather it, I looked at the original scientific article in the journal Cancer (see here for my post on what a scientific article looks like).



The researchers had taken breast cancer patients and split them into two groups – one group went to mindful meditation classes and the second group did not.  The scientists then took DNA isolated from the patients’ blood and tested the length of the chromosomes’ telomeres. To remind you, all people have 23 pairs of chromosomes in each and every cell. Telomeres are found at the ends of the chromosomes and protect the chromosomes from being damaged (essentially eaten away at from the ends by DNA-chomping proteins inside the cell).  A common comparison is to think of telomeres like the plastic bits at the ends of shoelaces: the shorter the telomeres (or the plastic bits), the closer the chromosome (or the shoelace) is to being damaged.   In this study, the researchers found that the telomeres of the cancer patients who went to meditation were longer than those of the patients who didn’t.  Because of this, their DNA was better protected.  How incredible!

How unbelievable. Unbelievable for a few reasons:

  1. The researchers looked at the length of telomeres over a three-month time span.  Telomeres shorten over a lifetime, so I wouldn’t expect to see a significant change over three months (whether sick, well, stressed, or not stressed).
  2. Because of this reasoning, I looked carefully at the data presented in the paper.  None of it was statistically significant.  There was a trend showing that patients who did not go to meditation had, on average, slightly shorter telomeres than patients who did, but nothing that convinces me that what they are seeing is real.  In fact, even the paper’s authors said that to have enough patients to reach statistical significance, they would need to do a bigger study with more than double the number of participants.
  3. As a final nail in the coffin, in the psychological analysis comparing the mood of the meditating versus non-meditating patients, the researchers didn’t observe a change in stress or mood scores.  So even if there had been a change in the DNA (which there wasn’t), since mood didn’t change, a decrease in stress cannot be the cause of the telomere/DNA changes.
I’m not saying that mood doesn’t have the ability to affect your DNA – maybe it does.  I just don’t think that they showed it in this study.


But I don’t want to harp on these researchers or this study. In fact, here’s another interesting example. A science journalist, Dr. John Bohannon, recently wrote a scientific article based on an actual study of people’s diets and how chocolate contributed to their weight loss. The study found that eating chocolate once a day significantly increased weight loss!  The data were real, the results were published, and the media picked the story up like wildfire.  It ran in news outlets around the world!!! The only problem: the conclusions drawn were crap, and Dr. Bohannon published them with the intention of baiting the media.  In this case, there were too few participants, so the team found something that was “statistically significant” in this group of people but wouldn’t necessarily pan out in a full, well-designed study. John Bohannon wrote a great blog post about this whole experiment and why this is the case.
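The small-sample trick works because of simple arithmetic: measure enough unrelated outcomes and one will come up “significant” by luck alone. A back-of-the-envelope sketch (assuming roughly 18 measured outcomes, the number Bohannon described in his own write-up, and treating the tests as independent, which is a simplification):

```python
# If each test has a 5% chance of a false positive (the usual
# p < 0.05 cutoff), the chance that at least one of many tests
# fires by luck alone grows quickly.
# Assumes ~18 independent outcomes, per Bohannon's write-up.

alpha = 0.05        # conventional significance threshold
n_outcomes = 18     # number of separate things measured

p_at_least_one = 1 - (1 - alpha) ** n_outcomes
print(round(p_at_least_one, 2))  # about 0.6
```

In other words, under these assumptions the study was more likely than not to find *some* “significant” effect even if chocolate did nothing at all.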

So what’s the message here?  I think the first is that the media often looks for science that can create a striking, head-turning headline.  The problem is that when the conclusion is so cool, journalists don’t always read the original article or evaluate the data to make sure that the cool headline is supported by evidence in the publication.  To be fair, journalists may assume that since other scientists already reviewed the paper for scientific accuracy (a process called “peer review”), it will be good to go.  But just because a stranger hands you a drink in a bar and says it’s okay, should you just believe them that it doesn’t contain roofies?  I also don’t want to imply that all science journalism falls into this trap, but with ever-shortening deadlines and competition for the “hot headline,” I can only imagine how appealing it is to take shortcuts.

To conclude, I actually have a problem in writing this post: I don’t have a solution for you, my reader.  Unlike the friend I was having drinks with, you may not have a scientist at your beck and call to vet all news stories for scientific accuracy.  And as much as I hope that this blog is helping you obtain your own PhD in biology, you won’t have all the tools you need to evaluate scientific articles on your own.  So maybe I will leave you with the tried-and-true saying “Don’t believe everything you read” – or maybe “If it sounds too good to be true…” it might be.

I’m not the only person who has written about this topic.  If you’re interested in reading more, check out this article from NPR, the numerous articles from Ben Goldacre about how science is misrepresented in the media compiled on Bad Science, and, for a different point of view, an article published in Salon about how just because someone is a scientist doesn’t mean they are an expert (especially if they are on Fox News).