On a recent evening in San Francisco, Tristan Harris, a former product philosopher at Google, took a name tag from a man in pajamas called “Honey Bear” and wrote down his pseudonym for the night: “Presence.”
While some blame our collective tech addiction on personal failings, like weak willpower, Harris points a finger at the software itself. That itch to glance at our phone is a natural reaction to apps and websites engineered to get us scrolling as frequently as possible. The attention economy, which showers profits on companies that seize our focus, has kicked off what Harris calls a “race to the bottom of the brain stem.” “You could say that it’s my responsibility” to exert self-control when it comes to digital usage, he explains, “but that’s not acknowledging that there’s a thousand people on the other side of the screen whose job is to break down whatever responsibility I can maintain.” In short, we’ve lost control of our relationship with technology because technology has become better at controlling us.
Under the auspices of Time Well Spent, Harris is leading a movement to change the fundamentals of software design. He is rallying product designers to adopt a “Hippocratic oath” for software that, he explains, would check the practice of “exposing people’s psychological vulnerabilities” and restore “agency” to users. “There needs to be new ratings, new criteria, new design standards, new certification standards,” he says. “There is a way to design based not on addiction.”
All this talk of hacking human psychology could sound paranoid, if Harris had not witnessed the manipulation firsthand. Raised in the Bay Area by a single mother employed as an advocate for injured workers, Harris spent his childhood creating simple software for Macintosh computers and writing fan mail to Steve Wozniak, a co-founder of Apple. He studied computer science at Stanford while interning at Apple, then embarked on a master’s degree at Stanford, where he joined the Persuasive Technology Lab. Run by the experimental psychologist B. J. Fogg, the lab has earned a cultlike following among entrepreneurs hoping to master Fogg’s principles of “behavior design”—a euphemism for what sometimes amounts to building software that nudges us toward the habits a company seeks to instill. (One of Instagram’s co-founders is an alumnus.) In Fogg’s course, Harris studied the psychology of behavior change, such as how clicker training for dogs, among other methods of conditioning, can inspire products for people. For example, rewarding someone with an instantaneous “like” after they post a photo can reinforce the action, and potentially shift it from an occasional to a daily activity.
Harris dropped out of the master’s program to launch a start-up that installed explanatory pop-ups across thousands of sites, including The New York Times’. It was his first direct exposure to the war being waged for our time, and Harris felt torn between his company’s social mission, which was to spark curiosity by making facts easily accessible, and pressure from publishers to corral users into spending more and more minutes on their sites. Though Harris insists he steered clear of persuasive tactics, he grew more familiar with how they were applied. He came to conceive of them as “hijacking techniques”—the digital version of pumping sugar, salt, and fat into junk food in order to induce bingeing.
Six months after attending Burning Man in the Nevada desert, a trip Harris says helped him with “waking up and questioning my own beliefs,” he quietly released “A Call to Minimize Distraction & Respect Users’ Attention,” a 144-page Google Slides presentation. In it, he declared, “Never before in history have the decisions of a handful of designers (mostly men, white, living in SF, aged 25–35) working at 3 companies”—Google, Apple, and Facebook—“had so much impact on how millions of people around the world spend their attention … We should feel an enormous responsibility to get this right.” Although Harris sent the presentation to just 10 of his closest colleagues, it quickly spread to more than 5,000 Google employees, including then-CEO Larry Page, who discussed it with Harris in a meeting a year later. “It sparked something,” recalls Mamie Rheingold, a former Google staffer who organized an internal Q&A session with Harris at the company’s headquarters. “He did successfully create a dialogue and open conversation about this in the company.”
Harris parlayed his presentation into a position as product philosopher, which involved researching ways Google could adopt ethical design. But he says he came up against “inertia.” Product road maps had to be followed, and fixing tools that were obviously broken took precedence over systematically rethinking services. Chris Messina, then a designer at Google, says little changed following the release of Harris’s slides: “It was one of those things where there’s a lot of head nods, and then people go back to work.” Harris told me some colleagues misinterpreted his message, thinking that he was proposing banning people from social media, or that the solution was simply sending fewer notifications. (Google declined to comment.)
At Unplug SF, a burly man calling himself “Haus” enveloped Harris in a bear hug. “This is the antidote!” Haus cheered. “This is the antivenom!” All evening, I watched people pull Harris aside to say hello, or ask to schedule a meeting. Someone cornered Harris to tell him about his internet “sabbatical,” but Harris cut him off. “For me this is work talk,” he protested.
Harris admits that researching the ways our time gets hijacked has made him slightly obsessive about evaluating what counts as “time well spent” in his own life. The hypnosis class Harris went to before meeting me—because he suspects the passive state we enter while scrolling through feeds is similar to being hypnotized—was not time well spent. The slow-moving course, he told me, was “low bit rate”—a technical term for data-transfer speeds. Attending the digital detox? Time very well spent. He was delighted to get swept up in a mass game of rock-paper-scissors, where a series of one-on-one elimination contests culminated in an onstage showdown between “Joe” and “Moonlight.” Harris has a tendency to immerse himself in a single activity at a time. In conversation, he rarely breaks eye contact and will occasionally rest a hand on his interlocutor’s arm, as if to keep both parties present in the moment. He got so wrapped up in our chat one afternoon that he attempted to get into an idling Uber that was not an Uber at all, but a car that had paused at a stop sign.
At a restaurant around the corner from Unplug SF, Harris demonstrated an alternative way of interacting with WMDs, based on his own self-defense tactics. Certain tips were intuitive: He’s “almost militaristic about turning off notifications” on his iPhone, and he set a custom vibration pattern for text messages, so he can feel the difference between an automated alert and a human’s words. Other tips drew on Harris’s study of psychology. Since merely glimpsing an app’s icon will “trigger this whole set of sensations and thoughts,” he pruned the first screen of his phone to include only apps, such as Uber and Google Maps, that perform a single function and thus run a low risk of “bottomless bowl–ing.” He tried to make his phone look minimalist: Taking a cue from a Google experiment that cut employees’ M&M snacking by moving the candy from clear to opaque containers, he buried colorful icons—along with time-sucking apps like Gmail and WhatsApp—inside folders on the second page of his iPhone. As a result, that screen was practically grayscale. Harris launches apps by using what he calls the phone’s “consciousness filter”—typing Instagram, say, into its search bar—which reduces impulsive tapping. For similar reasons, Harris keeps a Post-it on his laptop with this instruction: “Do not open without intention.”
Curious to hear more about Harris’s plan for tackling manipulative software, I tagged along one morning to his meeting with two entrepreneurs eager to incorporate Time Well Spent values into their start-up.
Harris, flushed from a yoga class, met me at a bakery not far from the “intentional community house” where he lives with a dozen or so housemates. We were joined by Micha Mikailian and Johnny Chan, the co-founders of an ad blocker, Intently, that replaces advertising with “intentions” reminding people to “Follow Your Bliss” or “Be Present.” Previously, they’d run a marketing and advertising agency.
“One day I was in a meditation practice. I just got the vision for Intently,” said Mikailian, who sported a chunky turquoise bracelet and a man bun.
At his speaking engagements, Harris has presented prototype products that embody other principles of ethical design. He argues that technology should help us set boundaries. This could be achieved by, for example, an inbox that asks how much time we want to dedicate to email, then gently reminds us when we’ve exceeded our quota. Technology should give us the ability to see where our time goes, so we can make informed decisions—imagine your phone alerting you when you’ve unlocked it for the 14th time in an hour. And technology should help us meet our goals, give us control over our relationships, and enable us to disengage without anxiety. Harris has demoed a hypothetical “focus mode” for Gmail that would pause incoming messages until someone has finished concentrating on a task, while allowing interruptions in case of an emergency. (Slack has implemented a similar feature.)
The biggest obstacle to incorporating ethical design and “agency” is not technical complexity. According to Harris, it’s a “will thing.” And on that front, even his supporters worry that the culture of Silicon Valley may be inherently at odds with anything that undermines engagement or growth. “This is not the place where people tend to want to slow down and be deliberate about their actions and how their actions impact others,” says Jason Fried, who has spent the past 12 years running Basecamp, a project-management tool. “They want to make things more sugary and more tasty, and pull you in, and justify billions of dollars of valuation and hundreds of millions of dollars [in] VC funds.”
Currently, though, the trend is toward deeper manipulation in ever more sophisticated forms. Harris fears that Snapchat’s tactics for hooking users make Facebook’s look quaint. Facebook automatically tells a message’s sender when the recipient reads the note—a design choice that, per Fogg’s logic, activates our hardwired sense of social reciprocity and encourages the recipient to respond. Snapchat ups the ante: Unless the default settings are changed, users are informed the instant a friend begins typing a message to them—which effectively makes it a faux pas not to finish a message you start. Harris worries that the app’s Snapstreak feature, which displays how many days in a row two friends have snapped each other and rewards their loyalty with an emoji, seems to have been pulled straight from Fogg’s inventory of persuasive tactics. Research shared with Harris by Emily Weinstein, a Harvard doctoral candidate, shows that Snapstreak is driving some teenagers nuts—to the point that before going on vacation, they give friends their log-in information and beg them to snap in their stead. “To be honest, it made me sick to my stomach to hear these anecdotes,” Harris told me.
There is arguably an element of hypocrisy to the enlightened image that Silicon Valley projects, especially with its recent embrace of “mindfulness.” Companies like Google and Facebook, which have offered mindfulness training and meditation spaces for their employees, position themselves as corporate leaders in this movement. Yet this emphasis on mindfulness and consciousness, which has extended far beyond the tech world, puts the burden on users to train their focus, without acknowledging that the devices in their hands are engineered to chip away at their concentration. It’s like telling people to get healthy by exercising more, then offering the choice between a Big Mac and a Quarter Pounder when they sit down for a meal.
And being aware of software’s seductive power does not mean being immune to its influence. One evening, just as we were about to part ways for the night, Harris stood talking by his car when his phone flashed with a new text message. He glanced down at the screen and interrupted himself mid-sentence. “Oh!” he announced, more to his phone than to me, and mumbled something about what a coincidence it was that the person texting him knew his friend. He looked back up sheepishly. “That’s a great example,” he said, waving his phone. “I had no control over the process.”