
Phishing Equilibria in E-Commerce: Shopping Sites and Dark Patterns


By Lambert Strether of Corrente

We have called attention to the use of “dark patterns” on websites before (2013), but until now the literature has been confined to definitions, examples, and classifications from website developers and User Interface/User Experience designers. Now we have a full-fledged academic study (“Dark Patterns at Scale“) that quantifies the number of sites that use dark patterns, and puts some rigor into the definitions and examples. I’m going to start with the original literature, where I’ll define terms, then summarize the study, and then briefly look at legislation introduced in the Senate (the “DETOUR” Act) to which the new study should lend support.

The term “dark pattern” was coined by independent user experience consultant Harry Brignull in 2010; he set up the “Dark Patterns” website to present a classification and examples. His definition:

Dark Patterns are tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.[1]

As a sidebar, the EIT Lab (“European IT Law by Design”, at the University of Louvain) gives a somewhat more evolved definition:

Dark patterns are choice architectures used by many websites and apps to maliciously nudge users towards a decision that they would not have made if properly informed. Such deceptive [practices] that exploit individuals’ heuristics and cognitive biases (e.g. interfaces designed to hide costs until the very end of the transaction, services and products added to the consumer’s basket by default, etc.) are already well-known and sanctioned in the consumer protection domain. Regrettably, dark patterns have become widespread also in the area of data protection, where users are de facto forced to accept intrusive privacy settings, because of the way the information is framed and privacy choices presented by website’s operators. When users are tricked by design not only can they be harmed but also their trust in the digital market is likely to be affected. Dark patterns are therefore posing an additional challenge to data protection law, that needs to be addressed in an interdisciplinary way.

I like the introduction of “choice architectures,” because that would make dark patterns a sort of bent nudge theory (assuming nudge theory is not itself bent). End sidebar.

In 2011, Brignull expanded on his ideas at the A List Apart webzine, writing:

Let’s continue a while as evil web designers: perhaps you’ve never thought about it before but all of the guidelines, principles, and methods that ethical designers use to design usable websites can be easily subverted to benefit business owners at the expense of users. It’s actually quite simple to take our understanding of human psychology and flip it over to the dark side. Let’s look at some examples:

[Image: Brignull’s list of psychological insights that can be subverted, including “People stick to the defaults”]

Take the second example, “People stick to the defaults.” (Brignull calls this a psychological insight, but I think the term of art is “cognitive bias.”) As a Mac user I was long ago trained — because the Apple Human Interface Guidelines were ethical — to go into any new software and adjust the user preferences, because software works for me, not the other way round. However, if I were not a suspicious old codger, I wouldn’t have gone into the settings of my horrid Android phone (sorry) and turned off as much of the default stupidity and exploitation as I possibly could. (Sadly, in order to do that, I had to give the phone company an email account — a burner, naturally — which is a fine example of the “forced enrollment” dark pattern, identified in the “Dark Patterns at Scale” study.)

Importantly, firms don’t create dark patterns out of the sheer desire to be evil — leaving Facebook and Uber aside, of course — but for profit. Brignull goes on:

[Business] can become accustomed to the resultant revenue, and [will be] unlikely to want to turn the tap off. Nobody wants to be the manager who caused profits to drop overnight because of the “improvements” they made to the website.

An example of a dark pattern (Brignull calls this “Friend Spam”) at the user interface level comes from LinkedIn. Here’s the tricky screen:

[Screenshot: LinkedIn’s signup screen asking users to “Get started by adding your email address”]

Tricked user Dan Schlosser explains:

LinkedIn asks you to “Get started by adding your email address.” There is a note explaining what this button does, but because it is put in light gray text next to a bold blue “Continue” button, they get most people to blindly click ahead. This is definitely a dark pattern. In fact, it’s really a lie. This page is not for “adding your email address,” it’s for linking address books.

BWA-HA-HA-HA! You think you’re just signing up, but you’re giving LinkedIn your entire address book! (And constructing LinkedIn’s equivalent of Facebook’s social graph. To be fair, LinkedIn was fined for this and dropped the practice.)

But dark patterns can also be devised at a level far above buttons in the user interface, at the site level, or even at the level of the internet as a whole, as TurboTax shows. From Pro Publica, “Here’s How TurboTax Just Tricked You Into Paying to File Your Taxes“, the background:

Intuit and other tax software companies have spent millions lobbying to make sure that the IRS doesn’t offer its own tax preparation and filing service. In exchange, the companies have entered into an agreement with the IRS to offer a “Free File” product to most Americans — but good luck finding it.

It’s a long article, because it details the really extravagant lengths that TurboTax went to, to stick their hand into your pocket. They made it impossible to get to the real free site from the paid site. They invented a fake “free” site that was confusingly similar to the real free site, but was not free. Then they made sure that the fake site came up first in Google searches, and the real free site was buried (a sketch of one burying mechanism appears below). If you knew what you were looking for, you could Google for the real free site. Going there, this dialog would appear:

[Screenshot: the TurboTax dialog]

Another Google search brought them to a page with two options: “See If You Qualify” and “Start for Free.” The “Start for Free” link brought them back to the version of TurboTax where they had to pay, but the “See If You Qualify” link finally took them to the real Free File program.

BWA-HA-HA-HA! It’s GENIUS! (And an entire web development team and its managers were paid quite well to implement this crooked scheme, too. Intuit is, of course, located in Mountain View, California.)
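(For the curious: ProPublica subsequently reported that Intuit buried the real Free File page by adding a robots “noindex” directive to its source, telling search engines not to list it. Here is a minimal sketch, with a placeholder URL, of how one might check a page for such a directive. This is my illustration, not ProPublica’s methodology.)

```python
# Minimal sketch: check whether a page asks search engines not to
# index it via a <meta name="robots"> tag. Pages carrying "noindex"
# are dropped from search results, however "free" they may be.
# The URL below is a placeholder, not TurboTax's actual Free File URL.
import re
import urllib.request

URL = "https://example.com/free-file"  # hypothetical placeholder

html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")

# Look for e.g. <meta name="robots" content="noindex,nofollow">.
# (A real checker would use an HTML parser; a regex keeps the sketch short.)
meta = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    html, re.I)
if meta and "noindex" in meta.group(1).lower():
    print("Page is hidden from search engines:", meta.group(1))
else:
    print("No noindex directive found.")
```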

So, with those definitions and examples under our belt, we can turn to Arunesh Mathur, Gunes Acar, Michael Friedman, Elena Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan, “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites” (PDF), Proceedings of the ACM on Human-Computer Interaction, November 2019. (Here is a mobile-friendly version of the text.) Here is the conclusion:

In this paper, we developed automated techniques to study dark patterns on the web at scale. By simulating user actions on the ∼11K most popular shopping websites, we collected text and screenshots of these websites to identify their use of dark patterns. We defined and characterized these dark patterns, describing how they affect users’ decisions by linking our definitions to the cognitive biases leveraged by dark patterns. We found at least one instance of a dark pattern on approximately 11.1% [1,254] of the [11K] examined websites. … Furthermore, we observed that dark patterns are more likely to appear on popular websites. Finally, we discovered that dark patterns are often enabled by third-party entities, of which we identify 22, two of which advertise practices that enable deceptive patterns. Based on these findings, we suggest that future work focuses on empirically evaluating the effects of dark patterns on user behavior, developing countermeasures against dark patterns so that users have a fair and transparent experience, and extending our work to discover dark patterns in other domains.

The “simulating user actions” part is really amazing and fun stuff. First, they built a crawler to detect shopping sites. Then, they curated the list of URLs the first crawler found, and fed that to a second crawler:

capable of navigating users’ primary interaction path on shopping websites: making a product purchase. Our crawler aligned closely with how an ordinary user would browse and make purchases on shopping websites: discover pages containing products on a website, add these products to the cart, and check out.
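To make the mechanics concrete, here is a minimal sketch of what such a checkout-path crawler could look like. This is my illustration, not the authors’ code: the URL and CSS selectors are hypothetical placeholders, and the study’s actual pipeline was considerably more robust (it had to discover product pages on its own rather than being handed one).

```python
# Toy sketch (not the authors' code) of a checkout-path crawler:
# visit a product page, add the product to the cart, proceed to
# checkout, and save the page text and a screenshot for later
# dark-pattern analysis. URL and selectors are hypothetical.
from playwright.sync_api import sync_playwright

PRODUCT_URL = "https://shop.example.com/product/123"  # hypothetical

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(PRODUCT_URL)

    # Step 1: add the product to the cart. Real sites vary wildly,
    # so a real crawler needs heuristics to find the right button.
    add_button = page.query_selector("button:has-text('Add to cart')")
    if add_button:
        add_button.click()

    # Step 2: open the cart and start checkout.
    page.goto("https://shop.example.com/cart")  # hypothetical
    checkout = page.query_selector("a:has-text('Checkout')")
    if checkout:
        checkout.click()

    # Step 3: capture what the user would actually see, for
    # offline labeling of candidate dark patterns.
    text = page.inner_text("body")
    page.screenshot(path="checkout.png", full_page=True)
    with open("checkout.txt", "w") as f:
        f.write(text)

    browser.close()
```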

They then manually examined the primary interaction paths, curated them for dark patterns, and threw them into buckets. Table 1:

[Table 1 from “Dark Patterns at Scale”: dark pattern categories, types, descriptions, and counts]

Table 1 summarizes each category (for example, “Obstruction”) and type (“Hard to Cancel”), giving a description, a quantification of sites and usages, and its place in the authors’ taxonomy (whether asymmetric, covert, deceptive, information hiding, or restrictive). Using their categories, I suppose that the LinkedIn example above would be classified as “Visual Interference,” because users would skip the grey type that explained what the prominent button really did. The TurboTax example would be “Trick Questions,” because of the deceptive language. Readers, you may test the robustness of these categories and types from your own online shopping experiences, if any.
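For a flavor of how such bucketing might be partially automated, here is a toy sketch that flags candidate dark-pattern text with hand-written regular expressions. To be clear, this is not the study’s method (the authors clustered similar text segments and then labeled them by hand); the phrases and mappings below are illustrative guesses.

```python
# Toy illustration (not the study's method): flag text segments
# that resemble common dark-pattern types using regexes. The
# phrases are illustrative guesses, not the study's patterns.
import re

PATTERNS = {
    "Low-stock Message (Scarcity)": re.compile(r"only \d+ left in stock", re.I),
    "Countdown Timer (Urgency)": re.compile(r"(offer|sale) ends in \d+:\d+", re.I),
    "Activity Notification (Social Proof)": re.compile(
        r"\d+ (people|others) (bought|are viewing)", re.I),
    "Confirmshaming (Misdirection)": re.compile(
        r"no thanks, i (don't|do not) (want|like)", re.I),
}

def flag_segments(segments):
    """Return (segment, matched type) pairs for later manual review."""
    hits = []
    for seg in segments:
        for label, rx in PATTERNS.items():
            if rx.search(seg):
                hits.append((seg, label))
    return hits

sample = [
    "Hurry! Only 3 left in stock.",
    "17 people are viewing this item right now.",
    "No thanks, I don't want free shipping.",
]
for seg, label in flag_segments(sample):
    print(f"{label}: {seg}")
```

Any hits would, of course, still need human review, which is essentially what the authors did with their clusters.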

Here is Figure 2, which shows that more popular websites are more likely to use dark patterns:

[Figure 2 from “Dark Patterns at Scale”: dark pattern prevalence by website popularity]

And then, there is “an ecosystem of third-party entities” that sells dark pattern development:

We discovered a total of 22 third-party entities, embedded in 1,066 of the 11K shopping websites in our data set, and in 7,769 of the Alexa top million websites…. Many of the third-parties advertised practices that appeared to be—and sometimes unambiguously were—manipulative: “[p]lay upon [customers’] fear of missing out by showing shoppers which products are creating a buzz on your website” (Fresh Relevance), “[c]reate a sense of urgency to boost conversions and speed up sales cycles with Price Alert Web Push” (Insider), “[t]ake advantage of impulse purchases or encourage visitors over shipping thresholds” (Qubit)….

In some instances, we found that third parties openly advertised the deceptive capabilities of their products. For example, Boost dedicated a web page—titled “Fake it till you make it”—to describing how it could help create fake orders. Woocommerce Notification—a Woocommerce[2] platform plugin—also advertised that it could create fake social proof messages: “[t]he plugin will create fake orders of the selected products.”

Finally, note that the problem may be even worse than the authors describe, because the scope of the study is limited:

1,818 represents a lower bound on the number of dark patterns on these websites, since our automated approach only examined text-based user interfaces on a sample of product pages per website.

In other words, if the dark pattern was implemented as a graphic, their crawlers wouldn’t catch it. Also, the study would not catch dark patterns above the UI/UX level, as with TurboTax creating an entire fake site, and then gaming Google search to point to it.

Finally, let’s turn to S. 1084, the “Deceptive Experiences To Online Users Reduction Act” (DETOUR Act). Introduced by Senator Mark Warner (D-VA), with Senator Deb Fischer (R-NE) as the original co-sponsor, it has sadly gained no additional co-sponsors. (Senator Josh Hawley (R-MO) is also toiling in the same vineyard with the Social Media Addiction Reduction Technology (SMART) Act.) Warner’s bill has a behavioral or psychological research component, but here is the section relevant to dark patterns:

SEC. 3. UNFAIR AND DECEPTIVE ACTS AND PRACTICES RELATING TO THE MANIPULATION OF USER INTERFACES.

(a) Conduct Prohibited.—

(1) IN GENERAL.—It shall be unlawful for any large online operator—

(A) to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data;

(B) to subdivide or segment consumers of online services into groups for the purposes of behavioral or psychological experiments or studies, except with the informed consent of each user involved; or

(C) to design, modify, or manipulate a user interface on a website or online service, or portion thereof, that is directed to an individual under the age of 13, with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user.

Gizmodo describes the bill as follows:

According to ZDNet, practices that could be targeted under the bill include suddenly interrupting tasks unless users hit consent buttons, setting “agree” as the default option for privacy settings, and creating convoluted procedures for users to opt-out of data collection or barring access “until the user agrees to certain terms.”

In other words, this would radically change how a handful of massive companies whose entire business model relies on monetizing troves of user data operate.

“Any privacy policy involving consent is weakened by the presence of dark patterns. These manipulative user interfaces intentionally limit understanding and undermine consumer choice. Misleading prompts to just click the ‘OK’ button can often transfer your contacts, messages, browsing activity, photos, or location information without you even realizing it,” Fischer added. “Our bipartisan legislation seeks to curb the use of these dishonest interfaces and increase trust online.”

As we have seen, Fischer is quite correct. The industry critique I have found focuses, tellingly, not on the patterns themselves, but on the definition of “large online operator”. Will Rinehart of the American Action Forum:

While the Act’s goals are laudable, it suffers from a crippling fault: ambiguity. Its broad language would give the Federal Trade Commission (FTC) legal space to second guess every design decision by online companies, and, in the most expansive possible reading, it would make nearly all large web sites presumptively illegal.

You say “make nearly all large web sites presumptively illegal” like that’s a bad thing!

The Act defines large online operators as any online service with more than 100 million “authenticated users of an online service in any 30 day period” that is also subject to the jurisdiction of the FTC. Conspicuously, the Act doesn’t define what constitutes an “authenticated user,” which is important for understanding its scope. If authenticated user means that a site must have user profiles, then Google wouldn’t be included at all because it doesn’t require users to create a profile. Furthermore, if these 100 million users all had to be in the United States, then only Facebook, Instagram, and Facebook Messenger would be regulated because only these sites hit the threshold.

So strike out “authenticated.” Or are only authenticated users entitled to avoid trickery? The critique also urges that (tl;dr) “manipulation of user interfaces” is good, actually, and everybody does it, but I would suggest that legislators crafting the bill simply talk to Mathur, Acar, Friedman, Lucherini, Mayer, Chetty, and Narayanan to find out what businesses do and do not do, and what is deceptive and what is not; in essence, Rinehart’s idea is that consumer fraud is indistinguishable from normal business practice in the digital realm. I don’t think the country has yet fallen that low.

* * *

Oh, the phishing equilibrium part: I don’t want to become tedious by constantly hammering Akerlof and Shiller’s long definition[3], so I’ll use my “on a postcard” version: “If fraud can happen, it will already have happened.” It’s hard to think of a better example of this principle than dark patterns. The fact that there’s an entire ecosystem of third parties that supports scamming the users makes the entire situation all the more disgusting.

NOTES

[1] Akerlof and Shiller give an operational definition of the outcome of a phishing equilibrium for users: “[M]aking decisions that NO ONE COULD POSSIBLY WANT” (caps in original); for example, retaining your subscription when you meant to cancel it.

[2] One might wonder if Adobe’s hilarious “Woo Woo” video is a sly insider reference.

[3] From NC, “Angling for Dollars: A Review of Akerlof and Shiller’s Phishing for Phools”:

The fundamental concept of economics is … the notion of market equilibrium. For our explanation, we adapt the example of the checkout lane at the supermarket. When we arrive at the checkout at the supermarket, it usually takes at least a moment to decide which lane to choose. This decision entails some difficulty because the lines are — as an equilibrium — of almost the same length. This equilibrium occurs for the simple and natural reason that the arrivals at the checkout are sequentially choosing the shortest line.

The principle of equilibrium, which we see in the checkout lanes, applies to the economy much more generally. As businesspeople choose what line of business to undertake — as well as where they expand, or contract, their existing business — they (like customers approaching checkout) pick off the best opportunities. This too creates an equilibrium. Any opportunities for unusual profits are quickly taken off the table, leading to a situation where such opportunities are hard to find. This principle, with the concept of equilibrium it entails, lies at the heart of economics.

The principle also applies to phishing for phools. That means that if we have some weakness or other — some way in which we can be phished for phools for more than the usual profit — in the phishing equilibrium someone will take advantage of it[2]. Among all those business persons figuratively arriving at the checkout counter, looking around, and deciding where to spend their investment dollars, some will look to see if there are unusual profits from phishing us for phools. And if they see such an opportunity for profit, that will (again figuratively) be the “checkout lane” they choose.

And economies will have a “phishing equilibrium,” in which every chance for profit more than the ordinary will be taken up.

“Every” really meaning every. To put the idea in simpler terms with a more limited use case: “If fraud can happen, it will already have happened.”
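To put that postcard version on a slightly more formal footing, here is a minimal gloss of my own (the notation is mine, not Akerlof and Shiller’s):

```latex
% Checkout lanes: no arrival gains by switching lines, so in
% equilibrium queue lengths differ by at most one customer:
\[ |q_i - q_j| \le 1 \quad \text{for all lanes } i, j \]
% Entry: capital flows to any opportunity earning more than the
% ordinary return \(\bar{\pi}\), so excess profits are competed away:
\[ \pi_i \longrightarrow \bar{\pi} \quad \text{for every opportunity } i \]
```

The bite of the phishing equilibrium is that “every opportunity” includes deceptive ones: any dark pattern earning more than the ordinary return attracts entry until that excess, too, has been harvested.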
