
We shouldn’t trust health studies that let people report what they ate

I was part of a research study and ended up lying on my data

Photo by Sean Gallup/Getty Images

Every time I read a health study that trusts people to report what they ate, I want to throw the paper in the trash. People are not to be trusted, and self-reported data is inaccurate to the point of being nearly useless. I know this from personal experience: I was part of a two-year, nationally funded weight-loss study. All the researchers asked of me was that, on one day about every six months, I tell them what I had eaten the day before. I went in with good intentions. You know what they say about those. I ended up lying on my report.

That doesn’t make me bad; actually, it makes me normal. Most of us eat mindlessly, without noticing quite how much we’re ingesting. Estimating calories is a skill that comes naturally to few; ditto estimating portion sizes. We’re so bad at these things, in fact, that the FDA proposed updating the serving sizes on nutrition labels to more accurately reflect how much Americans actually eat. It’s because we’re so bad at noticing how much we eat that people trying to lose weight are always told to keep a food diary. Only after being forced to write everything down are we shocked into realizing what we put in our mouths. And even those diaries aren’t, strictly speaking, accurate.

I was game at first, but by the third visit I had given up

This everyday laziness undermines a lot of nutrition research, which frequently relies on self-reported estimates. These studies are typically structured like the SMART health study, a two-year trial in college students funded by the National Heart, Lung, and Blood Institute. I joined the study in 2012; it was designed to see whether people would lose more weight when they could use social media tools and electronic encouragement. For example, a friend who was also enrolled was instructed to go to the gym four times a week to unlock a “message” someone sent him on the study website. Others got texts or were told to use mobile apps developed by researchers for the study.

I was in college and had just moved off campus, which led to a steady diet of takeout Mexican food. This, combined with a new friendship that seemed to revolve solely around spending afternoons baking, made me gain a lot of weight. Soon I weighed more than I ever had; I panicked, and so the study was appealing. It seemed like a win-win: maybe I’d lose weight more quickly, plus I would get paid. Because I ended up in the control group, all I needed to do was get measured about five times over two years.

At each visit, a nurse took my height, weight, blood pressure, and some other measurements. Then, the part I dreaded: reporting what I ate the day before. The study used a very clunky interface, complete with a Clippy-like penguin who “helped” me. The process took far too long.

A pop-up menu asked me to enter details from every meal. It was not simply a matter of marking “Honey Nut Cheerios, breakfast” and “Chipotle burrito, lunch.” I had to input the type of meal, the exact time of day, where I was, whether I was alone, and whether I was using the TV or computer while eating. I had to scroll through what felt like an endless list of brands, none of which seemed to be the right one. I barely remembered whether I had actually eaten lentils the day before, much less exactly how much. When it came to drinks, I had to remember how many glasses of water I drank, when I drank them, the size of each glass, and the amount of water in it. (There were visuals to help me judge.)

I was game, at first. By the third visit, I had given up. I didn’t care enough about the study to take meticulous notes the day before. I didn’t want to scroll through yet another menu of options for the kind of jam I put on my toast. I made up meals that seemed reasonable and, more importantly, would let me finish quickly and get out of the clinic and back to class.

Though I now feel embarrassed that I played fast and loose with research data, I don’t think my experience was unique. Asking for that level of detail requires far more patience than the average college student has. And I wasn’t even asked to estimate calories.

People are lazy, forgetful, and optimistic

The problem extends to exercise, too. A slew of studies show that we think we exercise more than we actually do. In fact, this is another reason weight loss often fails: we overestimate our exercise and then reward ourselves with ice cream that cancels out the entire spin class. Take a recent JAMA study that found that people using fitness trackers lost less weight than people without them. All participants had to self-report what they ate and how much they exercised during the trial. Based on the self-reports, people in the two groups ate and exercised about the same amount, and yet one group lost less weight. That is not possible: if both groups really had identical intake and identical activity, they would have lost the same amount of weight. If we are to believe this self-reported result, we must stop believing in the laws of thermodynamics.

The results of my study came out recently in The Lancet, years after my last measurement. Since the study didn’t focus primarily on what people ate, my indiscretions thankfully didn’t affect the results the way they would have in a food-based study. People assigned to use the social media tools had lost more weight after six months. But at the end of the two years, nobody’s weight was significantly different from where it started. One possible explanation is that nobody wanted to use the social media tools after a while, a finding consistent with the high abandonment rates for wearable trackers.

My results were an exception in the end, because I didn’t finish at my original weight: before the trial was done, I had already lost the weight I gained from too many baked goods. But my biggest takeaway wasn’t about social media and weight loss. It’s that people are lazy, forgetful, and optimistic. If we want actual data on what works, we can’t trust people to tell us themselves.