Addiction can be a difficult thing to see. From outward appearances, Dr. Zoe Chance looked fine. A professor at the Yale School of Management with a doctorate from Harvard, Chance’s pedigree made what she revealed in front of a crowded TEDx audience all the more shocking. “I’m coming clean today telling this story for the very first time in its raw ugly detail,” she said. “In March of 2012 … I purchased a device that would slowly begin to ruin my life.”
At Yale, Chance teaches a class to future executives eager to know the secrets of changing consumer behavior to benefit their brands. The class is titled “Mastering Influence and Persuasion,” but as her confession revealed, Chance was not immune to manipulation herself. What began as a research project soon turned into a mindless compulsion.
Chance admits she has a weak spot for games, at times playing until it hurts. “I have had problems with video games,” Chance tells me. “Basically, I can’t have any video game on any computer or phone or anything that I own because the first day that I have it, I just stay up playing it until my eyes bleed.” But during her presentation, Chance was not confessing an addiction to a video game. She was admitting her dependency on something seemingly far more benign, which had nonetheless controlled her mind and body. “They market it as a ‘personal trainer in your pocket,'” Chance said. “No! It is Satan in your pocket.” Chance was referring to a pedometer. More specifically, the Striiv Smart Pedometer.
Too Much of a Good Thing?
Admittedly, having rabid users is not a problem most companies face. The much more common dilemma is the opposite: a lack of customer engagement. However, Dr. Chance’s story brings up an ethical dilemma for companies seeking to change user behavior — is there ever too much of a good thing? Although addiction is most often associated with physical dependencies and controlled substances, behavioral addictions can be just as powerful and destructive. Gambling addiction, for example, haunts an estimated 2 million Americans and is formally recognized in the Diagnostic and Statistical Manual of Mental Disorders.
Behavioral addictions can wreak havoc on people’s lives, and millions suffer from problems related to online gaming, pornography, and other digital dependencies. Even when companies build products intended to help their users (like a gamified pedometer), there is potential for abuse, which raises the question: What responsibilities do companies have to protect users from themselves? Should companies like Facebook and gaming apps take measures to counteract the addictive properties of their products? If so, how?
The Addiction Paradox
The trouble is this: The attributes that make certain products engaging also make them potentially addictive. There is no way to separate the fun of gaming, for example, from its potential for abuse. Social media is exciting principally because it utilizes the same variable rewards that make slot machines compelling. Spectator sports and television watching, enjoyed by billions of people, share a common trait with the primary function of illicit drugs — they provide a portal to a different reality. If what we’re watching is engaging, we experience the high of being mentally elsewhere.
For proof, just visit your local sports bar during a big game. Instead of watching the match, watch the people. Look at the fans’ faces. Watch as the flickering images on the screens transport them across the digital cables and satellite signals to a different place. During the most exciting parts of the game, when their eyes fix with laser intensity on what is about to happen, they’ll be in a state similar to what cultural anthropologist and MIT professor Natasha Dow Schüll calls “the zone.” Though Schüll used the term to describe what drives gambling addicts, the zone describes the same exhilarating mental state experienced through other mediums.
To be clear, games, spectator sports, and social media are wonderful things. Few of us, myself included, would want to live in a world without them. These forms of entertainment provide harmless fun for the vast majority of people who enjoy them. However, each also has a potential dark side.
Dr. Chance’s experience with the Striiv pedometer provides a telling example. According to Striiv’s CEO, David Wang, the device is meant to help inactive people get moving by collecting points as they walk. Wearers of the pedometer use their points to build virtual worlds in the company’s Farmville-like app called “MyLand,” but Wang admits some people, like Chance, go overboard.
As with many obsessions, it was only a matter of time before Chance went too far. At the end of one particularly active day, Chance said she received a late-night notification from the app challenging her to walk a few flights of stairs. She accepted the challenge and marched quickly down and back up the steps leading to her basement. As soon as she completed the first challenge, she received another offer — walk four more flights and the app promised to triple her points. “Yes, of course! It’s a good deal!” Chance thought.
That’s when Chance says she lost control. For the next two hours she walked the stairs to and from her basement like a fitness-crazed zombie. By two in the morning, she had climbed over 2,000 stairs — far surpassing the 1,576 steps it takes to reach the top of the Empire State Building.
It wasn’t until the next morning that Chance felt the repercussions of what she had done. The Striiv spree left Chance with a painful hangover. The incessant jarring had strained her neck, making normal walking painful and her Striiv-motivated fitness goals impossible. “When the neck injury happened I had to take a break,” Chance said. “Which allowed me to finally acknowledge what my husband had been saying for a while — that I had a problem.”
Though Dr. Chance’s story reveals the power some technologies have over some people, the fact is most people do not get hooked. According to Wang, only 3% of users walk the 25,000 daily steps Chance was logging. Like most potentially addictive products, the percentage of users who meet the definition of addiction — “continued repetition of a behavior despite adverse consequences” — is very small. Studies show that relatively few people are susceptible to addiction and that even those who do get addicted often quit their dependencies when they change life stage or social context. However, there are people who cannot stop, even when they want to.
The current thinking among technology companies is that too much is never enough. Limitless time scrolling Facebook feeds, hours of pinning on Pinterest, days spent leveling-up in online games — there are no bounds. In the name of personal responsibility, companies leave addicts alone, never interrupting their zone-state cocoons.
While a laissez-faire customer relationship is the right approach for most users, when the user is addicted — no longer in control, wanting to stop but unable to — companies should intervene. Standing idly by while reaping the rewards of users abusing their product is no longer acceptable — it is exploitation. Thankfully, and for the first time, makers of potentially addictive products have the power to do something.
Outside of consumer tech, companies making potentially addictive products can claim ignorance regarding who is abusing their products. Makers of alcoholic beverages, for example, can throw up their hands and claim they have no idea who is an alcoholic. However, any company collecting user information can no longer take cover under the same excuse. Tech companies know exactly who their users are and how much time they are spending with their services. If they can hypertarget advertising, they can identify harmful abuse.
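To make the point concrete: the same event logs used for ad targeting could flag sustained overuse. Here is a minimal sketch of one way that might look; the thresholds, field names, and streak window are purely illustrative assumptions, not any company's actual policy.

```python
from datetime import date, timedelta

# Illustrative assumptions: what counts as "normal use" and how long
# overuse must persist before a user is flagged. Real values would need
# research and per-product tuning.
DAILY_LIMIT_MINUTES = 300   # assumed upper bound of normal daily use
STREAK_DAYS = 14            # assumed window of sustained overuse

def flag_overuse(daily_minutes: dict[date, int]) -> bool:
    """Return True if every day in the recent window exceeded the limit.

    `daily_minutes` maps each calendar day to the user's minutes of use,
    the kind of aggregate any analytics pipeline already produces.
    """
    latest = max(daily_minutes)
    window = [latest - timedelta(days=i) for i in range(STREAK_DAYS)]
    return all(daily_minutes.get(d, 0) > DAILY_LIMIT_MINUTES for d in window)
```

The design choice here is deliberate: a single heavy day (like Chance's stair marathon) does not trip the flag; only a sustained pattern of overuse does, which keeps the check from pestering ordinary enthusiasts.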
In fact, some companies have already started limiting certain features for people who overuse their sites. StackOverflow, the world’s largest technical question and answer site, intentionally restricts certain features to curb overuse. Jeff Atwood, the company’s co-founder, says the system was designed not only to improve the quality of content on the site but also to protect susceptible users. “Programmers should be out there in the world creating things too,” Atwood writes, making the point that he wants StackOverflow to be a utility, not a mindless distraction.
Use and Abuse
Though Dr. Chance says she was able to kick her behavioral addiction to the Striiv pedometer on her own, it took physical pain to help her realize she had lost control. “The blessing of the stairs episode was the neck injury.” It wasn’t long after she realized she had a problem that Chance decided to give up using the Striiv for good. She vowed to never use the pedometer again and mailed it to her sister far away in Massachusetts.
When it comes to potentially addictive products, companies should not wait for users to harm themselves. Companies who know when a user is overusing their product have both an economic imperative and a social responsibility to identify addicts and intervene.
Companies have an obligation to establish what I call a “Use and Abuse Policy,” which sets certain triggers for intervention and helps users retake control. Of course, what constitutes abuse and how companies intervene are topics for further exploration, but the current status quo of doing nothing despite having access to personal usage data is unethical. Establishing some kind of upper limit helps ensure that users do not abuse the service and that companies do not abuse their users.
Here’s the gist:
– When it comes to potentially addictive products, there are two types of users — those who use the product under normal parameters and those who abuse it.
– Addiction is defined as “continued repetition of a behavior despite adverse consequences.” Generally, a relatively small number of users form a technology addiction.
– However, companies that have information on which users are abusing their service have a moral responsibility to intervene.
– Drafting a “Use and Abuse Policy” that sets the upper limits of normal use and defines what constitutes harmful overuse is good for both the company and its users.
What do you think? Should companies building potentially addictive products have a “Use and Abuse Policy”? What else can companies do to limit abuse of their products?