
Nir’s Note: This Q&A recently appeared on the 15five.com blog, and it drew out some thoughts I’ve been chewing on about technology, addiction, and our relationship with the products we use. I’ve edited it slightly and hope you find it interesting.

Should We Worry About the World Becoming More Addictive?

Question: Pokémon GO is all the rage right now. Can you talk about that in the context of a habit forming product? Is it negative or positive?

Nir Eyal: We have to think of technology in the broader context of the environment we live in. The knee-jerk reaction to any new technology is that we don’t like it. We are averse to change and fear new technologies.

When you think about Pokémon GO (which is a lot less revolutionary than other technologies) in the context of what else people could be doing with their time, I think it’s pretty good. Pokémon GO can be considered one of the first mainstream fitness apps, one that just happens to be disguised as a game. You can’t play it sitting in your living room. Compare that to Clash of Clans or Candy Crush, which are not social and are entirely sedentary.

Q: People are so engrossed in the game that they are actually getting injured. What is the difference, psychologically or physiologically, between a habit and an all-out addiction? Is Pokémon GO an addiction or just a lack of awareness?

NE: A habit is just a behavior done with little or no conscious thought; roughly 40% of what you do every day is done this way. Habits can be good or bad, but addictions by definition are always bad. An addiction is a persistent, compulsive dependency on a behavior or substance that hurts the user.

It’s not a rule that any sufficiently good and popular technology will form an addiction in its users. If the user is not harmed and can stop the behavior without assistance, then it’s not an addiction. When we look at Pokémon GO, it doesn’t meet that definition for most people. It’s enjoyable, engaging, and habit forming. And yes, some folks won’t be able to stop and will become addicted. What to do about them is a different ethical question.

The good news is that, for the first time in history, the people making habit-forming products can mitigate the harm. Addiction is nothing new, but now the maker of a habit-forming product knows who the addicts are. Distillers of alcohol don’t have much insight into the identities or behaviors of their end users, so there’s not much they can do for them. But if they want to, companies that create products like Facebook, Instagram, and Pokémon GO can do something to help people addicted to their products.

As I work with and consult for these companies, I know that those addicted are a small number, only 1-2% of the population. But for that small percentage of users I think these companies do have an ethical responsibility, and I raise awareness of this issue to encourage them to do something about it.

For most of us, however, what people flippantly call an “addictive product,” like Pokémon GO or Facebook, is just an engaging product. But would we want it any other way? No, we want products that we enjoy using. The vast majority of people know when they are using these products too much, and they opt to self-regulate.

Q: In Hooked you raise the ethical question of manipulation. Research is emerging that overuse of social media (if not outright addiction) has negative impacts on mental health. Couldn’t Google, Facebook, and Twitter get ahead of the inevitable tech burnout by advocating for something like an hour of downtime each day, even factoring in the disruption to their revenue streams?

NE: What we’re seeing already is the proliferation of what I call attention retention devices – technologies specifically designed to block out the triggers and distractions from other technologies. Here are some examples that I use:

– DF YouTube is a Chrome browser extension that removes the suggested videos in YouTube’s sidebar, which keeps me from watching one video after another.

– Facebook News Feed Eradicator removes the News Feed, something engineered to suck me in, so it can’t distract me.

– I never read an article on my desktop; I always save it to Pocket. The app removes all of the ads and links to other articles, and I reward myself by listening to those articles later, when I go to the gym.

– I use Freedom to block my internet access while I’m writing. This prevents me from checking email or doing research when I should be doing the thinking my work requires.

Companies would be wise to make it easier to moderate use instead of making it so difficult to step away. Rather than letting users burn out and abandon their products altogether, these companies can help us moderate.

This is the challenge of our generation, the first to grow up with interactive technology from birth. We are struggling to figure out how to put tech in its place, even though it’s great and interesting and meets our needs so well.

Information today is no longer scarce. I’m a Gen Xer, and when I was applying to college, schools sent me pamphlets boasting about the size of their libraries. Today information is abundant, but knowledge and insight are scarce. To gain insight we need information, but we also need the attention and focus to process that information into knowledge. What will differentiate success from failure, and contributors from consumers, is our ability to focus and control our attention. How will we think deeply and get our work done when there is so much distraction out there?

By the way, this is not really new. Socrates and Aristotle debated the nature of akrasia, the tendency to do things against our own interests. We have always had distraction in our lives. When we have to do hard work, we try to weasel out of it. What has changed is the medium. Maybe for our grandparents it was reading a trashy novel. Maybe for our parents it was radio or TV. Today the new medium is interactive technology, but we are not helpless against it.


Q: I deactivated my Facebook account this week, but I need to post to social media as part of my job. Any suggestions for managers to disrupt negative employee habits regarding social media? Are there any companies that impose these protocols?

NE: So your question is, “as a social media manager, how do I avoid social media?” (Nir and I laugh.) I think the deeper concern is not how to eradicate it altogether, but how to prevent it from creeping into areas of our life where it doesn’t belong.

There are all kinds of things you can do to address that. Speak to your employer about your company culture to see what’s expected. Are you expected to be at the company’s beck and call 24/7? If so, you need to know that, and maybe you decide you’re not okay with it. Many employers will provide guidance on what’s expected, such as managing social media and email during a specific window each day.

Employees should also find out how much time they are expected to spend on what Cal Newport calls Deep Work, that is, thinking and producing as opposed to reacting. You cannot be tweeting repeatedly while you are writing an article or during your reflection time.

The other question, which I think is more serious, is, “What if I don’t like social media at all?” In that case, I don’t see how it’s different from picking any profession that doesn’t suit you. For example, I like nature, but I don’t like working outdoors all day. Being a forest ranger would not be a good career for me.

We should ask ourselves what suits our temperament. Just because something is a hot field doesn’t mean you have to work in it. And if you do like it, then keep it in your professional life and don’t let it bleed over into other parts of your life, like your personal time.

Q: In Hooked you talk about variable rewards. With 15Five, we have two different groups using the product: employees who fill out weekly reports (reporters) and managers who review those reports (reviewers). Both groups have different needs. Reporters want to feel seen and heard and to have their frustrations addressed. Some naturally supportive managers are intrinsically driven to respond to employee feedback, but others are only looking for a status update from employees that they can send up the ladder to their bosses. Their reward ends when they review the report, but they may not realize that if they don’t also respond to employees, those employees won’t fill out their reports again next week. How do we drive engagement for reviewers when they can’t see the distant problem of reporter disengagement?

NE: I think the problem is that people aren’t passing through the hook quickly enough; that has to happen very fast. The Facebook, Instagram, and Twitter hooks happen every time you interact with the product. For the manager, the trigger is the email saying that an employee has filled out a 15Five. The action is to open the app, and the variable reward is the curiosity about what the employee wrote. But what is the investment? What does the manager do before closing the app that closes the loop, stores value, and loads the next trigger?

It sounds like what you want the manager to do is say, “Thanks for the feedback, Bill! I’m working on it.” But they’re not. Whenever a single behavior is not occurring, I like to refer people to BJ Fogg’s behavior model, which says B=MAT: every behavior is the result of Motivation, Ability, and a Trigger converging. You’re probably not going to be able to work on motivation much, unless you can tell that manager’s boss to boost motivation by telling people to review and respond to every report or something bad will happen. That’s a more difficult and punitive path.
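As a rough sketch of how the model can be read (the threshold and numbers below are illustrative assumptions, not part of Fogg’s model or 15Five’s product), a behavior fires only when a trigger lands while motivation and ability together are high enough:

```typescript
// Illustrative reading of B=MAT: a behavior occurs only when motivation,
// ability, and a trigger converge at the same moment.
interface BehaviorContext {
  motivation: number; // 0..1, how much the user wants to do it
  ability: number;    // 0..1, how easy the product makes it
  triggered: boolean; // did a prompt fire (e.g. the "Bill filled out his 15Five" email)?
}

function behaviorOccurs({ motivation, ability, triggered }: BehaviorContext): boolean {
  const activationThreshold = 0.5; // hypothetical value for illustration
  return triggered && motivation * ability >= activationThreshold;
}

// Raising ability is often the cheapest lever: the same motivation clears the
// threshold once the product makes the action easier.
// behaviorOccurs({ motivation: 0.6, ability: 0.4, triggered: true }) -> false
// behaviorOccurs({ motivation: 0.6, ability: 0.9, triggered: true }) -> true
```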

Instead, I would look at ability and triggers. These are fully in your control. Unlike motivation, you can actually change ability and triggers inside your product. Make the trigger visible (so that the user notices it) and make it extremely easy for the user to take the desired behavior. I’m guessing that when Jennifer the manager receives the notification that Bill the employee has a frustration, she gets a big, open form field to reply in. There’s no pre-filled reply option. Jennifer is looking at a big, empty form field and doesn’t know what to write.

That may seem trivial, but every time users have to think, they take on cognitive load. The rule for forming habits is to reduce cognitive load so that doing is easier than thinking. So give Jennifer a few options to make it easier for her to carry out the intended behavior. If there were a note there that said, “Thank Bill with one of these three written responses,” what would that do to her ability to take the intended behavior?

I bet Jennifer is sufficiently motivated. She knows that Bill deserves thanks for sending his report, but it takes her time to craft a response. It seems trivial, but you must use technology to save the user from every bit of thinking you can. Send her three choices from a pool of a thousand canned responses, and I guarantee you’ll see an increase in reply rate.
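As a hypothetical sketch of that idea (the function and response pool here are invented for illustration, not 15Five’s actual feature or API), the product could surface a few personalized canned replies so the manager only has to tap one:

```typescript
// Hypothetical canned-reply pool; in practice it would be much larger and
// the selection could rotate rather than always taking the first few.
const cannedReplies: string[] = [
  "Thanks for the feedback, {name}! I'm working on it.",
  "Great point, {name}. Let's talk about this in our next one-on-one.",
  "Appreciate you flagging this, {name}. I'll follow up this week.",
  "Thanks for being candid, {name}. I hear you.",
];

// Personalize a small set of options to render as one-tap buttons under the report.
function suggestReplies(employeeName: string, count = 3): string[] {
  return cannedReplies
    .slice(0, count)
    .map((template) => template.replace("{name}", employeeName));
}

// Example: suggestReplies("Bill") returns three ready-to-send replies for Bill's report,
// replacing the blank form field with a one-tap action.
```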

Q: You recently wrote about software companies removing dashboards to decrease cognitive load. Any predictions for future UIs that will improve the customer experience?

NE: The Conversational User Interface (CUI) is very interesting, and I am eagerly anticipating more of these in the next few years. Some people call them bots, assistants, or conversational commerce. I call it a CUI, which simply takes complex information and filters it through a text message. Whether I am communicating with a bot or a human being, the information is simplified by making it look like a chat. The beauty is that anybody, including my tech-phobic mom, can use a chat, because she knows SMS.

We’ll see more companies using this interface to expand the use of complex technologies. For example, I see the dashboard as dying. Shots are already being fired, and I can’t wait to see it gone. When most people see charts, graphs, and data, they freak out and their brains turn off. The result is that they don’t want to use the app.

That’s where the CUI really comes in handy. We see massive shifts in user behavior when a technology comes along and takes something that geeks think is easy to do and everyone else finds impossible. That tech moves the ability curve down by orders of magnitude so more people can use it.

Think about the iPad. My mother never touched a computer before the iPad because it was too difficult to use. Suddenly there’s a touch interface with cute little apps and now we can’t take her off it.

We’ll also see something similar with data. Today there’s so much promise around data, but the interface sucks. It’s like working at a command prompt as opposed to a graphical interface. We need that layer of abstraction to make the technology usable, and that’s what we’re going to get with a CUI.

Instead of me looking at Google Analytics and wondering what to do about a spike in traffic, there should be a conversational UI that says, “Hey! Somebody posted your article on Reddit. There’s a spike in traffic, and here’s something you can do about it. Can I help you get that done?” You won’t have to look at the data; the data will reach out to you when something can be done to meet your goals.

For now, I still want the CUI to show me the graph and make suggestions I can answer with a yes or no. If we can get there as a first step, we know where this is going. It’s heading toward a conversation over text and eventually over voice.
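To sketch what that conversational layer might look like (the event shape, threshold, and wording below are invented for illustration; this is not a real Google Analytics integration), a CUI could translate a raw traffic event into a chat message plus a yes-or-no suggestion:

```typescript
// Hypothetical conversational layer over analytics data.
interface TrafficEvent {
  source: string;           // e.g. "Reddit"
  articleTitle: string;
  visitors: number;         // visitors in the current window
  baselineVisitors: number; // typical visitors for the same window
}

interface ChatPrompt {
  message: string;
  suggestedAction: string; // phrased so it can be answered with a simple yes or no
}

function toChatPrompt(event: TrafficEvent): ChatPrompt | null {
  // Stay quiet unless something is genuinely worth the user's attention.
  if (event.visitors < event.baselineVisitors * 3) return null;
  const multiple = Math.round(event.visitors / event.baselineVisitors);
  return {
    message: `Hey! Somebody posted "${event.articleTitle}" on ${event.source}. Traffic is about ${multiple}x your usual.`,
    suggestedAction: "Want me to show you the thread so you can join the conversation?",
  };
}

// Example: toChatPrompt({ source: "Reddit", articleTitle: "Hooked", visitors: 9000, baselineVisitors: 1500 })
// yields a chat message plus a yes/no suggestion instead of a dashboard.
```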

Q: It’s clear at this point that humanity is co-evolving with technology. The sci-fi geek in me imagines a world not unlike The Matrix. Any predictions, either positive or negative, about the future of techno-human relationships? (Nir agrees to answer on the condition that I do too. I went first.)

David’s Answer: I have a pretty negative prediction about the future. We will see groups emerge, like neo-Luddites, who are vehemently opposed to technology and will try to move us back to the analog age. People will become more disconnected from each other as a result of tech.

We will make choices that lean toward the negative side of humanity. For example, we have the ability to alter the genetic makeup of children in a test tube. Why are we choosing to do that? It’s not just to prevent disease and deformity; people want their kids to look a certain way!

We have ancient wisdom about ways to live a more balanced life, and science ends up proving these systems after the fact. For example, we know that family trauma gets into our DNA. But in general, as a society, we don’t value learning non-violent communication, or optimizing our lives for healthy relationships over accumulating wealth. It seems to me that while information and knowledge lie ahead of us, made ever more accessible by technology, all wisdom is already behind us.

Nir’s rebuttal: 200,000 years ago, when the first caveman invented the wheel, there was a guy next to him who said, “Ug! Wheel bad! You ruin our way of life!” That has always happened. I don’t think our humanity has anything to do with what appear to me to be superficialities.

The fact that people make babies in test tubes has nothing to do with our fundamental humanity. Two days ago I visited a friend of mine who has been married for over a decade; he is going to have a child via a surrogate, with an egg donor and the sperm of his partner. These are great people who are going to raise a great human being. We are so lucky to live in an age where that can happen, and we need to be cognizant of our technophobia.

With the perspective of history, it’s almost impossible to stay technophobic. The quality of life we enjoy today, owed in no small measure to technology, helps even the most financially struggling person live better than the most exalted kings of the past. It would be ridiculous to think that life was better in any other age of human history than right this second, particularly in America. We have things better than ever, and I don’t see that trend reversing.

I think there are risks. We will have bad things happen that are worse than before. But if you look at the long game, tech helps us live better lives. What scares people about these entertainment technologies is the sense that we should somehow be doing something else, something more in line with our values. Lots of things appear to be distractions, à la plugging into The Matrix.

Distractions are nothing new; just look at spectator sports, a form of entertainment that has existed since ancient Greece. How many hours today do people sit with their eyeballs glued to a flickering box showing a ball going back and forth between hoops or goals?

Take a step back and it’s meaningless, and yet there is something in the human condition that makes us desire distraction. Some people find that distraction in entertainment, religion, politics, or (unfortunately) drugs. It has been with us and will be with us forever. All that has changed is the medium. What’s interesting is that people have a gut reaction that other people’s distractions are stupid, wasteful, and frivolous, while justifying their own.

Without these things we are just a blue dot in a huge black vacuum of meaningless nothing. So we need them. That’s what makes us human. That same drive to pursue meaning is what helps us cure disease, help our fellow humans, and make the world a better place. It all comes back to the values that drive us to do the things we do.

The need to improve things is at the core of our humanity, not whether we make babies a certain way, eat a certain type of food, or what game we choose to play.
