Smartphone Addiction: The Slot Machine in Your Pocket
Smartphone apps are addictive -- by design. They take advantage of human weaknesses to ensure your constant attention. But there is another way.
When we get sucked into our smartphones or distracted by them, we assume it's an accident, and our responsibility alone. But it's not. It's also because smartphones and apps hijack our innate psychological biases and vulnerabilities.
I learned about our minds' vulnerabilities when I was a magician. Magicians start by looking for blind spots, vulnerabilities and biases of people's minds, so they can influence what people do without them even realizing it. Once you know how to push people's buttons, you can play them like a piano. And this is exactly what technology does to your mind. App designers play your psychological vulnerabilities in the race to grab your attention.
I want to show you how they do it, and offer hope that we have an opportunity to demand a different future from technology companies.
If you're an app, how do you keep people hooked? Turn yourself into a slot machine.
The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices? One major reason why is the number one psychological ingredient in slot machines: intermittent variable rewards.
If you want to maximize addictiveness, all tech designers need to do is link a user's action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing. Addictiveness is maximized when the rate of reward is most variable.
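The lever-pull mechanic can be sketched in a few lines of Python. This is a toy illustration of an intermittent variable reward schedule, not any app's actual code; the 30 percent reward probability is an arbitrary assumption:

```python
import random

def pull_lever(reward_probability=0.3):
    """One slot-machine-style interaction: sometimes a reward, often nothing."""
    if random.random() < reward_probability:
        return "reward"
    return None

# Ten "checks of the phone": the payoff is unpredictable by design,
# and that unpredictability is what drives compulsive re-checking.
pulls = [pull_lever() for _ in range(10)]
print(pulls)
```

The point of the sketch is the uncertainty: a fixed schedule (a reward on every pull, or on none) would quickly become boring, while the variable one keeps the next pull tantalizing.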
Does this effect really work on people? Yes. Slot machines make more money in the United States than baseball, movies, and theme parks combined. Relative to other kinds of gambling, people get "problematically involved" with slot machines three to four times faster, according to New York University professor Natasha Dow Schüll, author of "Addiction by Design."
A Sense of Belonging
But here's the unfortunate truth: Several billion people have a slot machine in their pocket.
When we pull our phone out of our pocket, we're playing a slot machine to see what notifications we have received. When we swipe a finger down to scroll the Instagram feed, we're playing a slot machine to see what photo comes next. When we "Pull to Refresh" our email, we're playing a slot machine to see what email we got. When we swipe faces on dating apps like Tinder, we're playing a slot machine to see if we got a match.
Sometimes this is intentional: Apps and websites sprinkle intermittent variable rewards all over their products because it's good for business. Other times, for example with email or smartphones, it's an accident.
Another way technology hijacks our minds is by exploiting the fear that there is a 1 percent chance we could be missing something important. Apps also exploit our need for social approval. When we see the notification "Your friend Marc tagged you in a photo," we instantly feel our social approval and sense of belonging are on the line. But it's all in the hands of tech companies.
Facebook, Instagram or Snapchat can manipulate how often people get tagged in photos by automatically suggesting all the faces we should tag. So when a friend tags me, he's actually responding to Facebook's suggestion, not making an independent choice. Through design choices like this, Facebook controls the multiplier for how often millions of people experience their social approval.
The same happens when we change our main profile photo. Facebook knows that's a moment when we're vulnerable to social approval: "What do my friends think of my new pic?" Facebook can rank this higher in the news feed, so it sticks around for longer and more friends will like or comment on it. Each time they like or comment on it, we get pulled right back in.
Everyone innately responds to social approval, but some demographics, in particular teenagers, are more vulnerable to it than others. That's why it's so important to recognize how powerful designers are when they exploit this vulnerability.
LinkedIn is another offender. LinkedIn wants as many people creating social obligations for each other as possible, because each time they reciprocate (by accepting a connection, responding to a message, or endorsing someone back for a skill) they have to come back to linkedin.com where they can get people to spend more time.
Like Facebook, LinkedIn exploits an asymmetry in perception. When you receive an invitation from someone to connect, you imagine that person making a conscious choice to invite you, when in reality, they likely unconsciously responded to LinkedIn's list of suggested contacts. In other words, LinkedIn turns your unconscious impulses into new social obligations that millions of people feel obligated to repay. All while they profit from the time people spend doing it.
Welcome to the empire of social media.
Western culture is built around ideals of individual choice and freedom. Millions of us fiercely defend our right to make "free" choices, while we ignore how our choices are manipulated upstream by menus we didn't choose in the first place.
This is exactly what magicians do. They give people the illusion of free choice while architecting the menu so that the magician wins, no matter what you choose.
When people are given a menu of choices, they rarely ask: "What's not on the menu?" Or: "Why am I being given these options and not others?" "Do I know the menu provider's goals?" "Is this menu empowering for my original need, or are these choices a distraction?"
For example, imagine you're out with friends on a Tuesday night and want to keep the conversation going. You open Yelp to find nearby recommendations and see a list of bars. The group turns into a huddle of faces staring down at their phones comparing bars. They scrutinize the photos of each, comparing cocktails. Is this menu still relevant to the original desire of the group?
Even When We're Not Hungry
It's not that bars aren't a good choice, it's that Yelp substituted the group's original question ("where can we go to keep talking?") with a different question ("what's a bar with good photos of cocktails?"). Moreover, the group falls for the illusion that Yelp's menu represents a complete set of choices for where to go.
The more choices technology gives us in nearly every domain of our lives (information, events, places to go, friends, dating, jobs), the more we assume that our phone is always the most empowering and useful menu to pick from. But is it? "Who's single to go on a date?" becomes a menu of faces to swipe on Tinder (instead of local events with friends, or urban adventures nearby). "Who's free tonight to hang out?" becomes a menu of most recent people who texted us. "What's happening in the world?" becomes a menu of news feed stories.
Companies maximizing "time spent" design apps to keep people consuming things, even when they aren't hungry anymore. How? Easy. Take an experience that was bounded and finite, and turn it into a bottomless flow that keeps going.
Cornell professor Brian Wansink demonstrated this in his study showing you can trick people into eating more soup by giving them a bottomless bowl that automatically refills as they eat. With bottomless bowls, people ate 73 percent more calories than those with normal bowls.
Tech companies exploit the same principle. News feeds are purposely designed to auto-refill with reasons to keep you scrolling, and purposely eliminate any reason for you to pause, reconsider or leave.
It's also why video and social media sites like Netflix, YouTube or Facebook autoplay the next video after a countdown instead of waiting for you to make a conscious choice.
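The bottomless-bowl pattern maps directly onto code. A minimal sketch, assuming a hypothetical paged content source (in a real app, `fetch_page` would be a server call):

```python
from itertools import count

def bottomless_feed(fetch_page):
    """Turn paged content into an endless stream with no stopping point."""
    for page_number in count():          # pages never run out
        for item in fetch_page(page_number):
            yield item

# Hypothetical page source; a real app would call a server here.
def fetch_page(n):
    return [f"story {n}-{i}" for i in range(3)]

feed = bottomless_feed(fetch_page)
print([next(feed) for _ in range(5)])
# -> ['story 0-0', 'story 0-1', 'story 0-2', 'story 1-0', 'story 1-1']
```

Notice what's absent: there is no final page and no "you're done" state. The design choice is precisely the removal of any natural endpoint where a reader might pause and leave.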
Tragedy of the Commons
Tech companies often claim that they're just making it easier for users to see the video they want to watch, when they are actually serving their business interests. And you can't blame them, because increasing "time spent" is the currency they compete for.
Companies also know that interruption is good for business. Given the choice, WhatsApp, Snapchat or Facebook Messenger would prefer to design their messaging system to interrupt recipients immediately instead of helping users respect each other's attention, because they are more likely to respond if it's immediate. It's in their interest to heighten the feeling of urgency. For example, Facebook automatically tells the sender when you "saw" their message, instead of letting you avoid disclosing whether you read it. As a consequence, you feel more obligated to respond.
The problem is: Maximizing interruptions in the name of business creates a tragedy of the commons, ruining global attention spans and causing billions of unnecessary interruptions each day.
Are you upset that technology hijacks your agency? I am too. I've listed a few techniques but there are literally thousands. Imagine bookshelves, seminars, workshops and trainings that teach aspiring tech entrepreneurs techniques like these. Imagine rooms of engineers whose job every day is to invent new ways to keep you hooked.
I didn't write this to depress you, or make you think that our only choice is to unplug completely. It doesn't have to be an all-or-nothing choice. Do we really want a world where our only options are to use smartphones and constantly get hijacked, or not to use them at all?
It's inevitable that billions of people will have phones in their pockets, but those phones can be designed to serve us rather than to hijack our minds.
We have an opportunity to demand a different future from the tech industry, just as the organic food movement let us demand a future from industrial agriculture that includes health and sustainability. I call it "Time Well Spent."
The 'Time Well Spent' Internet
Instead of maximizing "time spent" (in the name of advertising), imagine if apps offered alternative, hybrid or paid versions of services that maximized "time well spent" and were ranked that way in search and app stores.

Imagine if, instead of just releasing shiny phones each year, Apple and Google designed phones to protect minds from getting hijacked and to empower people to make conscious choices.

Imagine if there were a digital "bill of rights" outlining design standards for apps and websites -- for example, standards that forced apps to give people a direct way to navigate to what they want (look up a Facebook event) separately from what the apps want (without getting sucked into the news feed).

Imagine if companies had a responsibility to reduce slot machine effects by converting intermittent variable rewards into less addictive, more predictable ones with better design. For example, they could empower people to set predictable times during the day or week for when they want to check "slot machine" apps, and correspondingly adjust when new messages are delivered.
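The predictable check-in times proposed above could be sketched as a small batching rule. This is a hypothetical illustration, not any product's implementation; the 9:00 and 17:00 windows are assumed user preferences:

```python
from datetime import datetime, time, timedelta

def next_delivery(now, windows):
    """Hold a message until the next delivery window the user chose."""
    today_slots = sorted(datetime.combine(now.date(), w) for w in windows)
    for slot in today_slots:
        if slot >= now:
            return slot
    # All of today's windows have passed: deliver at tomorrow's first one.
    return today_slots[0] + timedelta(days=1)

windows = [time(9, 0), time(17, 0)]  # user-chosen check-in times
print(next_delivery(datetime(2024, 5, 1, 13, 30), windows))
# -> 2024-05-01 17:00:00
```

The design choice is simply to make delivery predictable: the message still arrives, but on the user's schedule rather than the moment it can interrupt.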
Imagine if tech companies helped us proactively tune our relationships with friends and businesses in terms of what we define as "time well spent" for our lives, instead of in terms of what we might miss. Imagine an independent organization that represented the public's interests -- an industry consortium of diverse experts or an FDA for tech -- that helped define those standards and monitored when technology companies abused these biases.
Imagine if web browsers and smartphones, the gateways through which people make their choices, were truly watching out for people and helped them forecast the consequences of clicks. When you put the "true cost" of a click in front of people, you're treating your users or audience with dignity and respect. In a "time well spent" Internet, choices could be framed in terms of projected cost and benefit, so people were empowered to make informed choices by default, not by doing extra work.
The ultimate freedom is a free mind, and we need technology that's on our team to help us live, feel, think and act freely.
We need our smartphones to be exoskeletons for our minds and interpersonal relationships that put our values, not our impulses, first. Let's protect our minds with the same rigor as privacy and other digital rights.
Tristan Harris, 31, is co-founder of the movement for Time Well Spent, a magician and an expert on how technology hijacks our psychological vulnerabilities. Until 2016, he was a product philosopher at Google, where he studied how technology affects a billion people's attention, well-being and behavior.