Navigating the Shadows of Online Advertising: Dark Patterns, Machine Learning, and Privacy Risks

Madison Gambone, Contributing Member 2023-2024

Intellectual Property and Computer Law Journal

Introduction

Have you ever felt like your phone was playing a trick on you? Maybe you felt confused when your Amazon page refreshed and the “limited-time offer!” countdown restarted. Or maybe you were upset when you got a notification that Joe shared a Snapchat with you, only to open the app and find he had merely posted to his story. While deceptive marketing practices may feel like a normal part of life, the consequences are far from ordinary. Today, companies use machine learning to amplify deceptive advertisements, gain access to consumers’ private data, and manipulate their choices. This blog post will introduce dark patterns and the role of machine learning in deceptive advertisements, examine current regulations and pending litigation, and discuss the impact that litigation may have on dark patterns and data privacy law.

AI technologies will continue to progress rapidly, and consumer data collection, use, and sale will generate unimaginable profits. Without regulation, dark patterns will spiral out of control, with machine learning allowing companies to cause irreparable harm to our privacy. The Federal Trade Commission must strictly enforce rules against deceptive design elements and prevent businesses from further violating consumers’ free choice in purchasing and privacy.

Background

STOP!! CLICK HERE BEFORE CONTINUING!![1]

Dark patterns are sophisticated design practices and psychological tactics that manipulate consumers into making choices they would not have otherwise made.[2] Dark patterns frequently appear as disguised advertisements or manipulated interfaces.[3] For example, dark patterns may take the form of countdown timers, limited-quantity alerts that depict fake scarcity, hard-to-cancel subscription services, or elements that result in a forced action, like bundled consent agreements.[4] Deceptive practices prevent consumers from making fully informed decisions by taking advantage of their cognitive biases to control their actions or delay access to information.[5] Companies benefit from using dark patterns by earning money through unintended purchases and recurring subscription charges and by collecting massive amounts of information without consent.[6]
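
To make the countdown-timer tactic concrete, here is a minimal, purely illustrative sketch (hypothetical code, not drawn from any real retailer) of why such a timer can restart on every page refresh: rather than storing one real deadline, the server simply computes a fresh deadline per request, manufacturing urgency that never actually expires.

```python
from datetime import datetime, timedelta

def offer_deadline(now: datetime) -> datetime:
    # A genuine limited-time offer stores one fixed deadline.
    # A dark-pattern timer instead computes a fresh deadline on every
    # page load, so refreshing the page "restarts" the countdown.
    return now + timedelta(minutes=15)

first_visit = datetime(2023, 9, 15, 12, 0)
refresh = first_visit + timedelta(minutes=10)

# Ten minutes after first seeing the offer, the shopper still sees a
# full 15-minute countdown: the deadline has quietly moved.
assert offer_deadline(refresh) > offer_deadline(first_visit)
assert offer_deadline(refresh) - refresh == timedelta(minutes=15)
```

A genuine offer would persist a single deadline per user; a countdown that resets on refresh is a sign the scarcity is fabricated.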

Using dark patterns to collect information presents a critical danger to consumer privacy. Dark patterns can be layered and combined with machine learning to target a specific individual’s or group’s interests, traits, behavioral patterns, and demographics.[7] This information is then used to create ads microtargeted to those people or groups.[8] Companies also use machine learning[9] for predictive analytics, presenting the advertisements a user will be most likely to interact with.[10]
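
As a rough illustration of the predictive-analytics idea described above, the following hypothetical sketch (the feature names and weights are invented for illustration, not taken from any real ad platform) scores users on behavioral signals and “microtargets” the ad at whoever the model predicts is most likely to click:

```python
# Invented behavioral features and weights; a real system would learn
# these weights from massive amounts of tracking data.
WEIGHTS = {"pages_viewed": 0.2, "past_clicks": 0.5, "time_on_site_min": 0.1}

def click_score(profile: dict) -> float:
    """Higher score = the model predicts the user is more likely to click."""
    return sum(WEIGHTS[k] * profile.get(k, 0) for k in WEIGHTS)

users = {
    "a": {"pages_viewed": 3, "past_clicks": 0, "time_on_site_min": 2},
    "b": {"pages_viewed": 10, "past_clicks": 4, "time_on_site_min": 9},
}

# The ad is microtargeted at whichever user the model ranks highest.
target = max(users, key=lambda u: click_score(users[u]))
```

The privacy concern is the input side: every feature in a model like this is a piece of behavioral data the company had to collect first.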

A well-known user of these tactics is Facebook (now known as Meta).[11] So well known, in fact, that a type of forced-action dark pattern, “Privacy Zuckering,” is named after Facebook CEO Mark Zuckerberg.[12] Privacy Zuckering occurs when businesses create deliberately confusing jargon and user interfaces to trick consumers into sharing more information than intended.[13]

Imagine you downloaded a popular messaging app that required you to enter your phone number. After you entered it, a notification popped up saying, “Text anyone in your phone,” with two options beneath it. One option says “OK,” and the other says “Learn More.” At the bottom of your screen is text with a hyperlink that says, “Syncing your contacts helps friends connect too. Manage Contacts.” How many dark patterns do you notice? You might have noticed there is no option to decline, or maybe you looked closely and saw the small gray text embedded in the pop-up that said, “App will continuously upload your contacts to connect you with friends.” However, unless you clicked “Learn More,” you did not discover the Privacy Zuckering that enabled the app to sell your phone number and all other personal data to third parties by clicking “OK.”

The issue is not with Facebook alone; any company can employ deceptive tactics, and many do. The real concern is that companies are constantly experimenting on consumers to build better deceptive tactics and to influence their purchasing and privacy decisions. While Facebook is not a unique risk to consumers, the way the FTC regulates it sets the tone for all other companies. The FTC must strictly enforce its rules against Facebook to deter other companies from manipulating consumers. The FTC has recently expanded consumer protections, but the current regulations against unfair and deceptive practices remain inadequate.

Discussion

The Federal Trade Commission (“FTC”) is a regulatory body that administers laws for consumer protection.[14] Section 5(a) of the Federal Trade Commission Act (“FTC Act”) declares deceptive or unfair acts or practices in the marketplace unlawful.[15] A deceptive practice involves an omission, misrepresentation, or misleading practice that deceives a consumer acting reasonably under the circumstances.[16] An unfair act or practice causes, or is likely to cause, substantial injury to consumers.[17] The injury must be one consumers cannot reasonably avoid themselves[18] and must not be outweighed by countervailing benefits to consumers or to competition.[19] The FTC also regulates deceptive design elements and privacy protection through the Restore Online Shoppers’ Confidence Act (“ROSCA”) and the Children’s Online Privacy Protection Act (“COPPA”).[20]

Pushing a consumer toward sharing data through illusory choice, or by substantially deviating from, obscuring, or subverting consumers’ privacy preferences, violates the FTC Act.[21] When enforcing privacy or data collection violations, the FTC has stated the harm is serious if consumers cannot reasonably avoid the collection and use because the practice is not clearly disclosed, or because access to essential goods and services is conditioned on providing the information.[22] For example, if businesses automatically and discreetly collect consumers’ biometric information as they enter or move through a store, consumers cannot avoid the collection or use of that information.[23]

In May 2023, the FTC issued a Policy Statement warning companies about consumer privacy and data security risks arising from the use of consumers’ biometric information[24] and related technologies, including those powered by machine learning.[25] Some technologies use biometric information to infer the characteristics of an individual, including age, gender, race, personality traits, aptitudes, or demeanor.[26] Collecting this information poses a risk to consumers because large databases of biometric information are attractive to malicious actors for illicit purposes, such as gaining unauthorized access to devices, facilities, or data.[27]

AI tools can be inaccurate, biased, and discriminatory by design, and they incentivize increasingly invasive forms of commercial surveillance.[28] These design flaws, inaccuracies, biases, and surveillance incentives can all harm consumers.[29] For example, the National Institute of Standards and Technology published research showing facial recognition algorithms were prone to error when the user had a disability and produced false positives more often for West African and East Asian faces than for Eastern European faces; for women more than for men; and for children and the elderly more than for middle-aged adults.[30] Because this technology is often used to provide access to financial accounts or to determine when consumers receive benefits or are subject to penalties, false positives result in identity theft, false accusations, and an inability to access resources.[31]

In 2011, after a series of complaints filed with the FTC by EPIC, Facebook settled the FTC’s eight-count complaint for unfair and deceptive practices, conceding that it had deceived consumers by failing to keep its privacy promises.[32] The FTC charged Facebook with repeatedly lying to users and manipulating them into sharing their personal data.[33] Within a few months of the FTC adopting the Final Order, Facebook had already resumed violating users’ privacy.[34] Despite this, the FTC did not charge Facebook with a violation until 2019, after Facebook gave the data mining firm Cambridge Analytica access to 87 million user records.[35]

In 2019, after the FTC fined Facebook five billion dollars for violating the Final Order, Facebook agreed to a second order taking effect in 2020.[36] The 2020 order expanded the required privacy program and independent third-party assessor’s role, required a privacy review of every new or modified product, service, or practice before implementation, required greater security for personal information, and imposed restrictions on the use of facial recognition software and telephone numbers obtained for account security.[37]

Earlier this year, the FTC formally accused Facebook in an Order to Show Cause, alleging Facebook violated both previous orders.[38] The proposed amendments include:

  • A blanket prohibition against monetizing data of those under 18;
  • Requiring written confirmation from the assessor before launching new products or services;
  • Requiring all companies Meta merges with or acquires to honor prior privacy commitments and the FTC orders;
  • Requiring affirmative consent for any future use of facial recognition software;
  • Broadening the requirement to provide conspicuous notice and to obtain the user’s affirmative express consent for changes in its data practices; and
  • Strengthening requirements from the 2020 order concerning privacy review, data inventory, access controls, and mandatory self-reporting.[39]

Is it starting to feel a bit repetitive? The FTC has repeatedly failed to adequately punish Facebook or prohibit the tactics used to violate consumer privacy.

None of the Order’s requirements meaningfully improves consumer protections. Pausing new or modified products or services does not prevent Facebook from acquiring the data that allowed it to deceive consumers. Facebook used machine learning to gather and use data by tracking location data, mouse movements and interaction points, and data from Instagram and WhatsApp.[40] That data allows algorithms to rapidly test consumer behaviors until the most deceptive and effective advertisement, microtargeted at the consumer most likely to click, persuades that consumer to grant permission to collect and share data.[41]
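
The rapid testing described above resembles what machine-learning practitioners call a multi-armed bandit. The sketch below is a hypothetical illustration (the variant names and click rates are invented): an epsilon-greedy loop that, over many impressions, shifts traffic toward whichever ad variant draws the most clicks, with no human ever deciding which design is "too" manipulative.

```python
import random

random.seed(0)

# Simulated "true" click rates per ad variant (unknown to the algorithm).
variants = {"plain": 0.02, "countdown": 0.05, "fake_scarcity": 0.08}
shows = {v: 0 for v in variants}
clicks = {v: 0 for v in variants}

def pick(eps: float = 0.1) -> str:
    """Epsilon-greedy: usually exploit the best-performing variant so
    far, occasionally explore another at random."""
    if random.random() < eps or not any(shows.values()):
        return random.choice(list(variants))
    return max(variants, key=lambda v: clicks[v] / shows[v] if shows[v] else 0.0)

for _ in range(5000):
    v = pick()
    shows[v] += 1
    clicks[v] += random.random() < variants[v]  # simulated user response

# `shows` now records which variant the algorithm served most often.
most_served = max(shows, key=lambda v: shows[v])
```

Because the loop optimizes only for clicks, a variant that wins by deceiving users is rewarded exactly like one that wins on the merits.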

Additionally, the proposed order does not deter other businesses from engaging in the same practices. Facebook, which earned $70 billion in revenue in 2019, was forced to pay a fine of only five billion dollars.[42] Facebook was fined for knowingly violating consumer privacy within months of receiving the first order and is now facing further action for continuing to violate it before, during, and after receiving the second order.[43] The FTC’s consequences for manipulating consumers for their data are effectively nonexistent when weighed against how much selling consumer data earns. The proposed order and the lack of meaningful action against Facebook signal to other companies engaged in the same behavior that the FTC will not prevent deceptive practices like Privacy Zuckering.

Next, the proposed order would strengthen prior policies and require compliance from all merged or acquired companies. As shown by the immediate and continuing violations of the first and second orders, however, the FTC’s bark is worse than its bite. EPIC reported that Facebook resumed violating consumers’ privacy before the first order was even finalized, and the FTC failed to take action until the Cambridge Analytica scandal.[44]

Finally, the proposed order would require conspicuous notice and affirmative consent before Facebook changes its privacy practices or uses facial recognition technology. The effect of these amendments could be monumental for consumer privacy in the online marketplace. The only issue is that these rules have been the standard for many years, and Facebook was already subject to, and disregarded, similar rules in the previous order. Further, requiring affirmative consent for the use of facial recognition software does not prevent the collection of biometric or behavioral data from consumers. The FTC is not adequately addressing the unfair and deceptive practices harming consumers.

To better regulate dark patterns and consumer privacy, the FTC must enforce strict compliance and take further action to protect consumers. The FTC should consider deceptive acts or practices to be those designed to induce unwanted actions, have differential impacts on targeted groups, and impair the decision-making ability of consumers.

The FTC should treat the collection and use of sensitive information gained through dark patterns as a material harm. Disclosures should be short and easy to understand. Companies should not be permitted to bury objectionable behaviors in disclosures or privacy consents, and terms of use should be easy for consumers to read and understand. Likewise, unwanted actions in the marketplace that result in consumers giving up out of frustration or fatigue, regardless of whether a purchase occurs, should be a material harm. The FTC should focus on safeguarding vulnerable consumers by requiring companies that use dark patterns and design-element experimentation to separately obtain express, affirmative consent. The FTC should also require companies to clearly disclose the experimentation being done and obtain consent from consumers for any future use of the results.

While it is unlikely the FTC will immediately implement any of the above-proposed solutions, an effective start would be applying those rules and standards to Facebook and any other company that is attempting to get away with manipulating consumers.

Conclusion

In conclusion, consumers are being manipulated by advancing technologies that collect and analyze data and transform it into deceptive design elements that trick consumers into unwillingly sharing personal data. The FTC must enforce strict compliance to prevent businesses from further violating consumers’ free choice in purchasing and privacy. It must impose harsh restrictions on Facebook to deter other companies from following its path. The current regulations are inadequate to protect consumers from the advancement of dark patterns and deceptive design driven by machine learning.


[1] Did I make you click?

[2] FTC Bureau of Consumer Prot., Bringing Dark Patterns to Light (Sept. 14, 2022) [hereinafter Staff Report].

[3] Press Release, FTC, FTC Report Shows Rise in Sophisticated Dark Patterns Designed to Trick and Trap Consumers (Sept. 15, 2022), http://www.ftc.gov/news-events/news/press-releases/2022/09/ftc-report-shows-rise-sophisticated-dark-patterns-designed-trick-trap-consumers.

[4] Types of Deceptive Pattern, Deceptive Design (last visited Sep. 15, 2023), https://www.deceptive.design/types.

[5] Staff Report, supra note 2, at 2.

[6] Id. at 3.

[7] Id. at 2.

[8] Alind Gupta, Targeted Advertising using Machine Learning, Geeks for Geeks (Mar. 1, 2023), https://www.geeksforgeeks.org/targeted-advertising-using-machine-learning/.

[9] Machine learning is a computer system’s ability to learn and improve based on algorithms that analyze patterns in massive amounts of data. AI software rapidly tests dark patterns against consumers to determine behavioral characteristics, and then uses that data to make dark patterns more effective and more difficult to spot. Staff Report, supra note 2, at 1-3.

[10] In predictive analytics, machine learning analyzes consumer behavior and purchasing patterns to predict which users are most likely to engage with certain ads or products. Gupta, supra note 8.

[11] Id.

[12] Privacy Zuckering, Deceptive Design (last visited Sep. 15, 2023), https://old.deceptive.design/privacy_zuckering/.

[13] Id.

[14] Staff Report, supra note 2, at 1-3.

[15] 15 U.S.C. §45(a)(1).

[16] 15 U.S.C. §45(n).

[17] Id.

[18] Id.

[19] Id.

[20] COPPA requires companies to protect children’s privacy and safety online, including by obtaining parental consent before collecting some types of information from children under 13. ROSCA requires a seller to clearly and conspicuously disclose all material terms of a transaction before obtaining billing information; obtain express, informed consent before charging the account; and provide simple mechanisms to stop recurring charges. Staff Report, supra note 2, at 9-11.

[21] Id. at 15.

[22] FTC, Policy Statement of the Federal Trade Commission on Biometric Information and Section 5 of the Federal Trade Commission Act (May 18, 2023), https://www.ftc.gov/system/files/ftc_gov/pdf/p225402biometricpolicystatement.pdf at 7.

[23] Id.

[24] The FTC defines “biometric information” as “data that depict or describe physical, biological, or behavioral traits, characteristics, or measurements of or relating to an identified or identifiable person’s body.” Id.

[25] Id. at 1.

[26] Id.

[27] Id. at 3-4.

[28] Id.

[29] Id. at 4.

[30] Id. at 4-5.

[31] Id. at 5.

[32] Facebook’s 2011 FTC Consent Order, Elec. Priv. and Info. Ctr. (last visited Sep. 15, 2023), https://epic.org/facebook-2011-ftc-consent-order/ [hereinafter EPIC].

[33] Press Release, FTC, Facebook Settles FTC Charges That It Deceived Consumers By Failing To Keep Privacy Promises (Nov. 29, 2011), https://www.ftc.gov/news-events/news/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep-privacy-promises.

[34] Press Release, FTC, FTC Proposes Blanket Prohibition Preventing Facebook from Monetizing Youth Data (May 3, 2023), https://www.ftc.gov/news-events/news/press-releases/2023/05/ftc-proposes-blanket-prohibition-preventing-facebook-monetizing-youth-data [hereinafter Monetizing Youth Data].

[35] In 2019, the FTC fined Facebook five billion dollars for violating the Consent Order. EPIC, supra note 32.

[36] Monetizing Youth Data, supra note 34.

[37] Id.

[38] Id.

[39] Facebook, Inc., Dkt. No. C-4365, 212 3091 (FTC, May 3, 2023).

[40] Gupta, supra note 8.

[41] Id.

[42] Mansoor Iqbal, Facebook Revenue and Usage Statistics (2023), Business of Apps (last visited Sep. 15, 2023), https://www.businessofapps.com/data/facebook-statistics/.

[43] Monetizing Youth Data, supra note 34.

[44] EPIC, supra note 32.
