How Nir Eyal’s habit books are dangerous

Blaming people for the weaknesses we exploit

Hired as a speaker throughout Silicon Valley and the international tech world, Nir Eyal has an appeal and influence that cannot be ignored. He literally wrote the book that has helped many startups and corporations exploit human weaknesses for profit. He’d just rather they have good intentions.

Using many of the same concepts found in game mechanics and gambling, Nir Eyal’s teachings assist the creation of technology that uses variable rewards systems to make people… not necessarily smarter, happier, or healthier… but primarily: hooked. To his credit, Eyal does not hide behind a euphemism here. In many dictionaries, the example sentence for hooked references narcotics.

hooked. adjective.

informal: Addicted. “A girl who got hooked on cocaine.”
Devoted to or absorbed in something.
— Lexico

It’s not hard to imagine how “addiction” springs to mind when discussing the habit-forming techniques outlined in the book. When thinking about video games and casinos, though, remember how playing the game often is the goal itself. Choosing to participate is part of the deal, and for added protection there is often a good deal of regulation around casinos.

Conversely, in the many “free” online communities we devote our time to in the 21st century, few people are aware that they are entering a veritable casino — a place designed specifically to keep them staying longer and paying attention to targeted messages. Now, some people will enjoy themselves thoroughly, accepting the game; some will not be as hooked as others; and others still will feel like their time is stolen from them. People with such a variety of experiences may not understand each other’s reactions, and may argue over whose experience is more valid. Arguments that feed back into the system.

Whether you appreciate them or not, book 1 argued for putting those habits in place. “Like us, or don’t like us? Discuss. Here is a text area for you. Just make sure you discuss it on our platform.”

Book 1 giving you a headache? Take book 2 and call me in the morning.

In a second book, Nir Eyal now wishes to help people distance themselves from the distractions they are hooked on, and become better at taking control over their own lives.

Let’s just take a moment to appreciate the irony of how book 2 is written to help people get out of the habits that book 1 taught the tech companies to impose on them. The pitch to the publisher is easy enough to imagine:

“Are you sure there’s a market for this book, Nir?”

“Am I sure there are people out there with habits they don’t want? Sure, I helped put them there!”

Cennydd Bowles has a great analogy for this: Poachers turned gamekeepers.

But what about the good habits?

Time and time again, I see the argument that getting people hooked is good if the outcome is positive. Like exercising, like eating healthier, like taking steps to improve the environment. Rarely do I see these arguments address the fact that there are billion-dollar industries focusing minute-by-minute on making us do the exact opposite of positive. And I have a decent idea about which books they read.

The idea that we create change by deviously influencing people to create so-called positive habits, instead of educating them about the many actors already influencing their habits, is a maddening concept for me. If “winning” relies on getting people subconsciously hooked on the “right stuff”, then we’ve already lost.

I’ll give you this: there are, for example, exercise apps that help people make changes in their lives. Ones that employ nudging. But once again, these apps were chosen and asked to do just this. I purposely downloaded the app and asked it to nudge me in a certain direction. It’s as close to consent as many apps come in this industry at the moment.

Please note how this differs immensely from downloading an app that is covertly pushing me in a thousand different directions I did not ask for, have not consented to and am likely only vaguely aware of.

And please forgive me, because to make the above point I had to leave out how most health apps are based on bogus science. You’ll notice how I didn’t say the habits they create are necessarily positive.

Nir Eyal and ethics

I of course do not believe Nir Eyal has any evil intent. He says himself he has the exact opposite intent. He wants people to build good, healthy products and use his teachings to attract and keep a user base and change people’s behavior for the better. To reason around good intent, Eyal has even coined the phrase “the morality of manipulation”.

This comes very close to saying that removing people’s autonomy, regardless of the will of the person, is fine as long as you’ve decided that the behavior you are promoting is for the greater good. Is this an unfair interpretation? Please read on.

What I do believe is that Eyal is really good at popularising certain ideas and concepts on human behaviour that others can readily make use of for their own gains. But a focus on simplifying human behavior can lead to some really dangerous assumptions. Some of them strikingly naive.

I am far from the first to criticise and point out how Eyal’s teachings can be used for human exploitation. But the subsequent thinking about consciously managing negative impact has not reached the level of maturity one might hope for.

In February 2018 Nir Eyal was interviewed for Journalism + Design. One question was: “How can companies build products that are persuasive but not coercive?”

This is Eyal’s response:

There are two questions I tell people to ask themselves:

1. Do you believe the product you’re working on materially improves people’s lives?

2. You have to see yourself as the user. In drug dealing, the first rule is never get high on your own supply. I want people to break that rule and get high on their own supply, because if there are any deleterious effects, you will know about them.

There’s a simple market incentive to not build products to screw people. We’re not automatons, we’re not manipulatable puppets on strings. If a product hurts people, they’ll stop using it.

Had I said anything remotely similar to this last year, I would have been ostracised by the design industry.

I am distressed by how someone can respond to a question of ethics without mentioning anything about actually speaking to other people, involving people who are regular victims of prejudice and mistreatment, or even maybe regularly listening for harm. No, the two-step program asks you to believe you will do good and to try the product yourself, and you’re good to go. The powers of the market will protect people from harm.

Also, by his own account, Eyal has worked with the tech industry for years, and was involved early on in building apps. He has taught a course at the Stanford Graduate School of Business on product design. And yet this simple truth is not expressed: you as the maker do not pretend to be a user. That’s a recipe for disaster, destined to miss important issues with usability, accessibility, diversity, exclusion and impact. Only people not involved in product development will uncover what you are too biased to see. Stanford’s own design school includes in its core abilities “Learn from Others (people and contexts)”. Thankfully “Pretend to be the User” is not one of them.

Honestly, what saddens me the most in all of these conversations is this lack of recognition that the people who are being harmed the most are the ones always sidestepped by society, with no voice and no platform to object or stand up for themselves. Not always marginalised but almost always edge cases.

And herein lies the danger, and why I believe Eyal’s books cause significant ethical issues. While providing tools to start experimenting with habit-forming techniques, there has to be some reflective reasoning around the potential for negative impact. Something beyond “your intent and the powers of the market will keep everyone safe”.

But the second book, will it not release people from the powers of habit and make the dangers of the first book obsolete? Well, the premise of the second book seems to be exactly this: don’t fight the companies, they are only doing their job; fight your own lack of discipline. Apply yourself and set yourself free from those investing billions to influence your choices.

The takeaway here seems to be that if you’re having problems, the problem is you. I believe it’s fair to point out the intrinsic dangers in this type of assertion. I am a strong believer that many people can become better at controlling their own emotional reactions to the world around them, but it is precisely because this is a very hard thing to do that we as humans are susceptible to external influence. The very reason why much of the advice in book 1 really works.

To be clear, it is not always the teachings themselves I object to but the lack of guidance around them, or when an illusion of guidance actually exacerbates potential harm. Such as getting high on your own supply.

Guidance would be things like:

  • Design to change habits is a super-interesting concept, but how about making sure we design with people, with their consent, and not at them.
  • There is no sure-fire way to change habits (individual traits, external factors and context matter), which is why we have to be really careful about listening to outcomes so we’re not pushing people in a detrimental direction. What I’m saying here is that even with positive intent (which Eyal tells you to have) there are obviously dangers.
  • People can indeed be empowered to better manage their relationships with digital services, but the ability to do so is also influenced by education, power of voice, social support, time, health and resilience. Regulation can help protect those devoid of power to influence the market. Consumer rights are not governed by waiting for market forces to rectify themselves.
  • That some people have a healthy and joyous relationship with a digital service does not mean that the same service is not also causing a negative impact for a significant number of other people. These two things can be true at the same time, especially with companies dependent on maintaining a massive user-base for their survival.
  • Impact also applies to people who are excluded. There is something to be said for bridging gaps rather than widening them.

I’m positive you can help me think of more guidance to avoid hurting people with these tools.

Surely, people will see through any wrongdoing?

There are also inconsistencies in the messaging around habit-shaping. Arguments like this pop up:

So in this day and age, if you screw people over, if you make a product people regret using, guess what? Not only are they going to stop doing business with you, they’ll tell all their friends to stop doing business with you. — Nir Eyal

Seemingly never having heard of fake reviews, Eyal does indeed sometimes seem to be saying that customers, consumers and citizens will prevail in this struggle, because they will see companies for what they really are, and spread the word. But he is simultaneously arguing for companies to create habit-forming products, ones that people use without thinking. Since Eyal likes to allude to narcotics, I’m keen on understanding how a corporate focus on making people unconsciously hooked aligns with empowering people to use less of something, and with giving them an incentive to tell their friends to stop using it.

I would say the argument that people will just stop using bad products is a dishonest one. Especially for a behavioral scientist. Today we know so much more about how emotions and fear control human behavior. The concept of people as rational creatures is a dwindling one – emphasised by Daniel Kahneman and Richard Thaler’s respective Nobel Memorial Prizes in economics. Consider, for example, how much more we understand today about why people stay in abusive relationships. You don’t have to be addicted to be impelled down a dangerous path by habits.

Of course, it is not likely that Nir Eyal himself truly believes that people will simply leave bad relationships with products. He is otherwise quoted as saying, more to his point: “The best products do not necessarily win.” Really, the point of his teachings has been to create “a monopoly of the mind” and make products “that people turn to with little conscious thought”.

I can’t remember the last time someone solicited my subconscious to make me spend time with them, and I decided to leave them and tell all my friends. I’m no behavioral scientist, but I’m going to presume this is because the whole point was for me to never be conscious of it happening. And to my point, even if I enjoyed the interaction it does not necessarily follow that the manoeuvre was ethical.

Few people can, with a straight face, assert that people will just leave companies that apply bad tactics, and then write a whole book on the premise that people will not just leave companies that diminish their well-being.

But Eyal just did. In fact, it would seem people now need Eyal’s second book to exercise forethought and discipline for managing these relationships with products. Obviously, the companies that underpin this need are in no way culprits here. I mean, all they did was follow advice from the first book. Good on them.

Rounding off, I really want to emphasise that I still believe Nir Eyal when he says he has positive intent. But sometimes even positive intent backfires and we all – including myself – need to proactively take responsibility, and recognize accountability, for the ideas and knowledge we share. Minimising negative impact is also about seeing and responding to the many ways people can take your work and misuse it. I hope to highlight dangers to boost awareness. In the end, I much prefer conscious decision-making over subconscious when impacting both others’ and my own well-being.

Per Axbom


Per Axbom is a Swedish communication theorist born in Liberia. For two decades he has educated digital professionals and helped organizations with digital usability and accessibility. Per makes tech safe and compassionate through reflective reasoning, human-considerate design, coaching and teaching. You can hear his voice on UX Podcast.
