How "dark patterns" work
Ideally, designers should try to make user interaction with the interface as pleasant as possible. But sometimes we stumble on something that doesn't work in our favor. For example, we notice that subscribing to a service is much easier than unsubscribing from it.
UX expert Harry Brignull coined the term "dark patterns" for such techniques. With their help, interfaces unobtrusively push users to do something they didn't plan to do, or prevent behavior that is not beneficial to the company.
Let's say you want to unsubscribe from a mailing list. After scrolling to the end of the email and making some effort, you find the unsubscribe link. It's small, pale, and tucked away at the very bottom, under several paragraphs of text: a clear sign that the company is putting obstacles in the way of unsubscribing. The button offering to buy something at a discount, by contrast, is usually large, bright, and at the very top.
An example from a newsletter. Try to find the unsubscribe link
Or another example. Signing up for a monthly subscription to most services is effortless, which is not the case with unsubscribing. Sometimes the method of customer retention is unobtrusive: a bright button that says "No, I want to stay" and a less prominent one that says "Yes, I really want to unsubscribe."
An example of an unsubscribe page
It seems like a small thing. Most people will know where to click. But if even a few inattentive users click the bright button and stay subscribed, the company makes money.
Many companies make it difficult for customers to leave. Those customers will leave eventually, but if friction keeps them around 10 or 20 percent longer, each account lives, and pays, a little longer. Multiply that across hundreds or thousands of customers and the extra revenue adds up to a serious amount, all from people who intended to cancel anyway.
In other situations, the barriers to actions that don't profit the company are more serious. For example, if you want to delete your Amazon account, you cannot do it yourself: you have to contact the company and ask an employee to do it. And the "Close Your Account" instructions page on Amazon.com greets you with a list of reasons to abandon the idea.
If you still intend to go through with it, you'll need to fill out a special form. You will then receive an email explaining, once again, why you shouldn't delete your account. If you're absolutely sure, you can click the link at the end of that long email. It takes you to a page where you must send Amazon yet another request confirming that you really do want to delete your account.
Brignull calls such schemes a "roach motel": easy to get in, much harder to get out. These traps are not always created intentionally. It usually takes a lot of effort to streamline sign-up, while the process of closing accounts rarely makes it onto the developers' list of priorities.
But in cases like Amazon's, the interface designers deliberately complicate the opt-out mechanism, because from the company's point of view it shouldn't be easy. Of course, you could argue that Amazon doesn't want users to delete their accounts carelessly, so the hurdles are there to protect people. But it also benefits the company when customers get so tired of trying to delete their account that they keep it.
Brignull identified many more types of dark patterns. For example, "sneak into basket," in which a store slips something extra into your order while you're buying another product. It could be a warranty or a data plan you don't need, and you have to remove it from your cart manually.
You've also probably encountered "confirmshaming," where the interface tries to make you feel guilty so that you'll agree to some option or won't unsubscribe from a mailing list. For example, it shows you a picture of a sad puppy, or a full-screen banner asking you to sign up for a newsletter with only two options: "OK" and "No, I hate reading interesting stories."
What users should do
The bad news: companies have whole teams busy inventing and testing such techniques, and you can rely only on yourself.
The good news is that you have a powerful tool: knowledge. If you know about cognitive biases and the tricks services use to manipulate your behavior, you'll find it easier to resist them.
If you notice a dark pattern, call it out publicly. A complicated unsubscribe process may help a company make money, but if the company is publicly accused of misleading customers, it's likely to change the design.
Not all dark patterns are built into websites intentionally. Sometimes designers don't even realize that their interface manipulates users; many simply apply what works. And not every attempt to influence our behavior hurts us.
Nevertheless, it's always worth remembering that design can change user decisions, and that the company's goals are not necessarily aligned with yours. Keeping that in mind is how you protect yourself.