Fitter, Happier, More Productive: The Promises and Failures of Self-Tracking Technologies

We have reduced the notion of health to a set of standards that tend to be binary, arbitrary, or both.

by Dana Cass on July 2nd, 2015

Jenny Jones biked 16.7 miles with Strava. Div Chaudhuri checked in at CrossFit Key Bridge. Kara Miller invited you to the group 90 DAY CHALLENGE!!!!!!

Your Facebook feed doesn’t lie: the Quantified Self movement is experiencing a moment. Wearables—the key to the movement’s success—are skyrocketing in popularity. Researchers at Scripps have rewritten Moore’s Law for the wearable age: “a doubling every 5 years of the number of mobile devices connected to the Internet, leading to approximately 50 billion in 2020.” SparkPeople, where you can track your calorie intake and exercise and participate in forums, boasts 15 million members; the cycling app Strava raised $18.5 million in Series D funding last October. If it’s a bubble, it’s a big one.
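To make that projection concrete, the arithmetic is simple exponential growth. Assuming a baseline of roughly 12.5 billion connected devices in 2010 (my back-of-the-envelope figure, not one from the Scripps paper), a doubling every five years gets you to 50 billion by 2020:

\[
N(t) = N_{2010} \cdot 2^{(t - 2010)/5}, \qquad N(2020) \approx 12.5\ \text{billion} \times 2^{10/5} = 50\ \text{billion}.
\]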

Three different smart watches on a single wrist.

Photo CC-BY TeppoTK, filtered.

Self-tracking promises to help you “unlock your potential,” “make a change,” even learn “what makes life worth living.” You can download an app that tells you whether today was a good day and one that tracks your posture. You can buy a fork that tells you if you’re eating too fast. You’re thinner, better-slept, straighter-backed. You’re happier and healthier. Your Facebook friends will know how many miles you biked. You won’t up and die of a heart attack one day like both of my grandfathers did. And all you have to do is punch a few numbers into your cell phone or wear a bracelet. Wait a few more years and all you’ll need is an implant.

Here’s the problem: the Quantified Self movement and the industry that’s sprung up around it are predicated on a subjective definition of health and healthy behavior. The tech industry has responded by developing apps and hardware that cast an intimate and narrow gaze on our health and activity, creating a power dynamic that threatens us with the psychological distress associated with surveillance.

We have reduced the notion of health to a set of standards that tend to be binary, arbitrary, or both. Take fatness: for several decades now, there’s been a stigma against being fat. This was once a matter of aesthetic preference, but “health culture” has turned it into a moral judgment. Never mind that being fat is not synonymous with having high cholesterol or diabetes, or that thinness can be a symptom of illness. Within health culture, fatness is perceived as visible evidence that you’re unhealthy.

From here, the rhetoric expands to state that the health of the individual is a shared concern. This stems mostly from the theory that unhealthy individuals raise the cost of healthcare for everyone—and from there, if you like, it’s easy to go down the Obamacare rabbit hole, and suddenly your taxes are paying for my Big Mac. Now, we’ve arrived at the conclusion that if I’m fat, I’m unhealthy, and therefore I’m negatively affecting your quality of life.

Cue the tech industry, coming in to save the day, with apps that make it easy to fix your unacceptable problems by embedding them into your everyday life. Surveillance culture follows swiftly to seize this opportunity and bore another hole into our personal lives.

Human pulse sensor on the back of a FitBit.

Photo CC-BY Billie Ward, filtered.

A lot has been said, and with good reason, about how self-quantification exposes us to unwanted surveillance by state actors. But what about social surveillance? The act of tracking yourself with an app that someone else built means that you’re putting your data in the hands of someone who is going to use it to write a story. Maybe that story will be shown only to you, to tell you that you’re unhappier than your peers; maybe your data will be a few points in an infographic about how many glasses of water Americans drank this year. Whatever the purpose, the act of committing your information to a database controlled by a third party exposes you to the psychological stresses of surveillance. Several studies have found that people under surveillance in some capacity report increased feelings of stress, depression, and loss of control, among other effects.

We also behave differently when we know we’re being watched. In many ways, our behavior under surveillance can be interpreted as an improvement. Research shows that the practice of digital self-quantification—specifically, using your smartphone to track your diet and activity—can promote success in a weight-loss program. Weighing these possible outcomes, we can choose to suffer the consequences of surveillance in order to reap the benefits of better behavior. “Better” is subjective, though; more accurately, we’re behaving how health culture tells us to behave. This is in large part because technology is amplifying the voice of health culture—a voice that we trust uncritically, potentially to our detriment.

We look at technology as an authority, and so we do what it tells us. (I think of that old poem—“Eye halve a spelling chequer / It came with my pea sea”—which warns how our unconditional belief in technology can fail us.) When our watches tell us that fatness and depression and poor posture are unacceptable, we believe it. Technology doesn’t account for nuance—and neither do the social and corporate structures that prop up our current fascination with health.

The companies that design and sell tracking apps and wearables are as diverse as any other startup in the industry—which is to say, they’re not. Their boards, founders, and executive teams are experts in designing and selling technology products. They’re the same people who made Flappy Bird and Venmo: mostly men, mostly white, from privileged backgrounds, with the confidence you need to win hearts and wallets in Silicon Valley. They sold us on swiping right, and now they’re selling us on a vision of health that they have defined on our behalf.

It’s a simplistic vision that relies on assumptions made by the largely homogeneous upper class in Silicon Valley. The lack of diversity in VC funding means that, by and large, the definition of health that drives self-tracking apps and wearables is what’s healthy for young, wealthy, white men. These companies assume that medical advice from a 24-year-old with a degree in computer science and a recreational interest in, say, nutrition is appropriate for broad, indiscriminate distribution.

A running track with the shadow of a runner cast on it.

Photo CC-BY Chris RubberDragon, filtered.

That’s unlikely. The Atlantic made good and relevant points last year when it called out some notable omissions in Apple’s Health tracker. Consider, too, how many apps put undue emphasis on weight loss, calorie restriction, and athletic performance. The startup field is also more of a sandbox for testing out ideas than a scientific institution. Take Thync, which claims electrical stimulation can catalyze a mood shift of your choosing (energy, for example, or calm). It was developed by neuroscientists, but their peers say the science is questionable at best. The science backing apps and wearables is frequently little more than theory and good marketing copy. For most of the consumer market, though, technology is shrouded in an air of mystery that protects it from scrutiny and impels us to trust it implicitly.

Neil Richards wrote in 2013 that surveillance creates “an effect on the power dynamic between the watcher and the watched.” When we purchase, use, and trust apps and devices designed by the tech hegemony, we allow them to assert their dominance. As the psychological effects of surveillance set in, we grow stressed and eager to please, and we follow their guidance unthinkingly in order to reap the benefits we’ve been promised. Silicon Valley, with one eye on “making the world a better place” and the other on bringing in profits, designs products to exploit this power dynamic, using tactics like sharing and gamification to increase retention.

Nowhere is this more insidious than in self-tracking apps, where sharing and gamification quickly give rise to obsession. You’re trying to lose weight—of course you are, because if the 24-year-old developer who built your app had his way, every woman would lose ten pounds—and your fitness app congratulates you for every calorie you burn, and so you burn more calories, because your fitness app isn’t going to say, “Hey, two runs a day probably isn’t a great way to spend your time.” You’re tracking the rise and fall of your moods and squinting at a sparkline of your postural changes over the past 48 hours, keenly aware of how you fall short of the 68% of Flywheel riders who ride at a higher average resistance than you do.

Fitness wearable and accompanying mobile app, displaying heart rate and fitness data.

Photo CC-BY N i c o l a, filtered.

This is no way to live. And if you extrapolate from the early statistics on how long users stick with these devices, it looks like it might be the death of wearables, too.

A survey published in 2014 found that more than half of respondents who owned an activity tracker had quit using it. That same report recommended that device manufacturers and app makers “exploit” social sharing to improve retention. But what if that’s what’s driving users away? Feeling like you’re being watched? Feeling your quality of life, on the whole, decrease when you obsessively track your every move, thought, and calorie? Feeling like there’s no point because you’re never going to measure up to these standards of perfection anyway?

More than a century ago, Samuel Warren and Louis Brandeis wrote in their seminal treatise on privacy that “modern enterprise and invention have, through invasions upon [man’s] privacy, subjected him to mental pain and distress, far greater than could be inflicted by mere bodily injury.” Today, we are exposed to pain and distress by the structures of power in the tech industry that have warped the definition of health to meet their vision, often ignoring science in the process.

And that’s a shame, because there’s good to be found in the practice of self-tracking. Both of my grandfathers died young of heart attacks; if using a Nike+ watch to track his heart rate means my dad will be around for another 40 years, then I’m all for it.

But where’s the balance? How do we harness the benefits of self-tracking without forcing harmful and ultimately counterproductive methods of motivation onto users? Is there any universe in which we can disentangle tracking apps from surveillance without making them a totally unsustainable proposition? Can we design apps that allow for a more nuanced view of health: encouraging people to understand fullness cues rather than counting calories, perhaps, or minimizing the emphasis on weight altogether? How can we discourage uncritical trust in apps designed by technologists without medical expertise?

These aren’t easy questions. I expect that at some point there will be a backlash against the pseudoscience behind our current market for wearables and tracking apps. As that market dies down, we’ll be left with the tools that have staying power: likely ones grounded in science that don’t take a psychological toll on their users. And as venture capital slowly continues to diversify, perhaps the market will begin to cater to a wider variety of consumers, with devices and apps that more accurately reflect their needs and their individual health.

Regardless of market behavior, tracking is no passing trend: Quantified Selfers point to Ben Franklin and his 13 virtues as their progenitor. Our opportunity now is to relentlessly question and challenge what we’re buying and who’s making it. We can encourage VCs to fund devices based on legitimate science and designed for a broader population of users. And we can fight our instinctive, implicit trust in the technology market and the structures that prop it up.