Companies required to comply with the California Consumer Privacy Act (CCPA) may need to rethink how they obtain consent from consumers. A new law, approved by California voters in November last year and due to come into effect in 2023, will revise the definition of consent under the CCPA to preclude any consent obtained through the use of dark patterns.
What are dark UI patterns?
Dark patterns are elements of digital user interface (UI) which are designed to take advantage of inherent psychological biases and lead users towards making certain choices. In user-centred UI design, the goal of the designer is to maximise usability and enhance a user’s experience of a digital product or service. Dark patterns are carefully crafted UI design features which invert the user-centred goal to instead influence users to make choices which maximise the interests of the online service provider (often without the user’s awareness).
There is a range of different categories of dark patterns (for a comprehensive review of these, we recommend the report of the Norwegian Consumer Council); in summary, however, dark patterns may include:
Misdirection techniques, which use visuals or language to direct users away from a choice. For example, an online service provider may make an option that is favourable to the provider more prominent by using a large, colourful button, while making the option that is less favourable to the provider less prominent, by using small, grey font placed to the side.
Readers will likely recognise a number of these techniques. For example, when buying flights online, users are often taken to a screen which presents a seat map of the plane and invites the user to select a seat (for a fee), whilst at the bottom edge of that page a much smaller link allows the user to skip this step (for no fee).
Alternatively, techniques can involve a user being presented the option to sign up for a newsletter to receive a discount on their purchase, where the ‘decline’ option is framed as “No thanks, I’d rather pay full price!”
Scarcity techniques, which can involve an online service provider indicating that there is low stock or that a product or service is currently in high demand, creating a (sometimes false) sense of urgency for the user to complete their purchase.
Obstruction techniques, where a provider makes it extremely easy to sign up to and join a service, but incredibly difficult to cancel that membership or subscription.
“Default” settings being set at the maximum level of data sharing or most privacy-intrusive option.
No doubt, capitalising on human psychology to drive profits or to encourage consumers to engage in certain behaviour is standard practice – many of us are aware that fast food outlets use the colour red to drive appetite, or that store layouts are designed to make customers walk past a larger number of products than might otherwise be necessary. Online, however, dark patterns can be used in subtle and sometimes problematic ways to mislead users.
Regulation of dark patterns
While dark patterns are often used to drive users to purchase more goods and services, they have gained particular notoriety for the way they are occasionally used by online service providers to steer users into sharing more data or choosing more privacy-intrusive options.
This is one of the harms which California is seeking to specifically prohibit, by providing that consent will not be validly obtained for the purposes of the CCPA where it is obtained through the use of dark patterns. However, it is not yet clear which dark patterns will be targeted under this new law, or how consent obtained through those techniques will be distinguished from consent obtained ‘fairly’.
California isn’t the only jurisdiction to take aim at dark patterns. Data protection authorities in Europe have also considered the impact of dark patterns on the validity of consent. In 2019, France’s data protection authority, the Commission nationale de l'informatique et des libertés, published a report (Shaping Choices in the Digital World) stating that the use and abuse of “a strategy to divert attention or dark patterns can lead to invalidating consent.” Other European bodies have also queried how UI design features which obfuscate privacy-enhancing choices, or which set privacy-intrusive options as default, can be reconciled with the requirements of Article 25 of the GDPR, which mandates data protection by design and data protection by default.
The Australian context on dark patterns
Although Australian regulators have not yet focused on dark patterns as publicly as their counterparts in California or Europe, they are clearly aware of these techniques.
In its December 2020 submission to the Australian Government’s review of the Privacy Act 1988 (Cth) (Privacy Act), the Office of the Australian Information Commissioner (OAIC) identified that “some APP entities operating online use so-called ‘dark patterns’ designed to nudge individuals to consenting to more collections and broader uses of personal information”, in a way which the OAIC considered could limit the usefulness of consent as a privacy protection. The Australian Competition and Consumer Commission (ACCC) also referred to dark patterns in its Digital Platforms Inquiry Final Report (see our article 'Digital Platforms Services Inquiry Interim Report: 7 things you need to know').
Existing privacy and consumer protection laws in Australia may be sufficiently broad to address any problematic use of dark patterns by online service providers. For example, where dark patterns are used in a way that means that informed, voluntary, current and specific consent is not obtained where required, such conduct could be considered an interference with privacy under the Privacy Act. Alternatively, where the use of dark patterns constitutes conduct which is misleading or deceptive, that conduct could be a contravention of the Australian Consumer Law. To that end, new regulation may not be strictly necessary to capture such conduct, although neither of the existing legal frameworks goes as far as the proposed Californian law in specifically addressing the unique issue posed by dark patterns.
This is not to say that the use of dark patterns is unlawful per se. To the contrary, dark patterns are prolific and commonly used in Australia, as well as globally (as was noted by the OAIC in its submission). However, now more than ever, the practices of online service providers are being scrutinised through both a privacy and a consumer protection lens, and in particular, the ACCC has demonstrated its willingness to pursue enforcement (including in respect of matters which may traditionally have been thought to fall within the remit of the privacy regulator). If ACCC Chair Rod Sims’ 2021 Compliance and Enforcement Priorities are any indicator, “more cases will follow”. As such, if entities do choose to employ these techniques, it is important that they consider the overall effect that dark patterns may have on the agency of users who are asked to make choices on that entity’s website or software, and how freely given and informed any consent obtained from users will be.
Authors: Sophie Bogard, Nikhil Shah and Tim Gole