28/01/2022

On 20 January 2022, the European Parliament adopted its final amendments to the Digital Services Act (DSA) which, together with the proposed Digital Markets Act (DMA), constitutes some of the most significant reform to internet platform legislation in recent times. The DSA will now proceed to negotiations with EU member states and, if made into law, will have a profound impact on the way digital services are provided in the European Union. Its coverage extends to all online intermediaries that offer their services in the EU Single Market, including internet access providers and domain name registrars, as well as hosting services and online platforms, regardless of whether they are established in the EU or elsewhere.

One of the main goals of the DSA is to create a safer digital space in which the fundamental rights of all users of digital services are protected. The DSA introduces reforms to the handling of harmful online content, the protection of online users’ fundamental rights, restrictions on the collection of personal data for advertising purposes, and limitations on online behavioural advertising (see here for more on the online behavioural advertising changes).

In line with its goal of protecting EU fundamental rights (in particular, freedom of expression and of information), a key area of reform under the DSA relates to dark patterns, which are used by online service providers to nudge or pressure users towards making particular decisions. While you have most likely interacted with the somewhat ominous sounding “dark patterns”, you could be excused for not knowing what they are - our previous article, Regulators shine a light on dark patterns, explains what they are and some of the techniques used to influence user behaviour. The DSA bans providers of intermediary services from using deceptive or nudging techniques on recipients of their services, and from using dark patterns to distort or impair user autonomy. Although a similar proposed amendment to the DSA failed in December 2021, the most recent proposal passed the Parliament’s vote on 20 January 2022.

Changes to dark patterns under the DSA

Under the DSA, providers of intermediary services would be restricted from using their online interfaces (whether through their structure, design or functionality) to impair users’ ability to make free, autonomous and informed decisions or choices (Article 13a). The Recitals of the DSA state that recipients of services should be empowered to make decisions about matters such as the acceptance of, and changes to, terms and conditions, advertising practices, privacy and other settings, and recommender systems, without being subjected to practices which exploit cognitive biases to prompt users to purchase goods or services they do not want, or to reveal personal information they would prefer not to disclose (Recital 39a).

While the prohibition is general in nature, the DSA specifically refers to limiting intermediary services from engaging in practices such as:

  • giving unequal visual prominence to the available consent options when asking the user for a decision (a form of misdirection technique);
  • repetitively requesting or urging the recipient to make a decision, such as repeatedly requesting consent to data processing where consent has previously been refused, whether directly (especially in the form of a pop-up that interferes with the user experience) or through the use of automatic refusal configurations;
  • urging a user to change a setting or configuration after the user has already made a choice; or
  • making the procedure to cancel a service significantly more cumbersome than signing up to it (a form of obstruction technique).

The DSA provides that the European Commission may adopt a delegated act to update the list of specified practices.

The DSA does not affect the requirements for consent to the processing of personal data under the GDPR, including that consent must be ‘freely given, specific, informed and unambiguous’. However, these reforms look to address a perceived grey area in the GDPR around the use of dark patterns to influence the obtaining of user consent. For example, the ban on dark patterns would capture hard-to-reject “cookie walls”, where websites make it more difficult to reject cookies than to accept them by employing tactics such as requiring multiple clicks to display the reject button, or making the reject button pale in comparison to its “accept” counterpart. Instead, websites will need to ensure equal prominence is given to the “accept” and “reject” options (see here for more on cookies).

The DSA also requires very large online platforms that use recommender systems (systems which suggest, rank and prioritise information, distinguish it through text or visual representations, or otherwise curate information) to implement technical and organisational measures to ensure those systems are designed in a consumer-friendly manner and do not influence end users’ behaviour through dark patterns.

Where to for dark patterns here in Australia?

The EU is not the only jurisdiction looking to rein in the use of dark patterns. The California Privacy Rights Act (CPRA), which is due to take effect on 1 January 2023, specifically calls out dark patterns in the context of valid user consent to data processing, amending the existing California Consumer Privacy Act (CCPA) to include a specific mandate that ‘agreement obtained through use of dark patterns does not constitute consent’ under the CCPA. Beyond this initial step, in March 2021, California’s Office of Administrative Law also approved regulations designed to ban the use of dark patterns which obscure the process for opting out of the sale of personal information or which impair a consumer’s choice to opt out.

The proposals in the EU and the US may accelerate dark pattern reform here in Australia (see our recent article - Is this the start of a global war on Dark Patterns?). The ACCC, in its third Digital Platform Services Inquiry report released in September 2021, indicated potential measures that could be adopted by search engines and browsers to mitigate the use of dark patterns, and set out a recommended prohibition to capture ‘conduct that is particularly harmful to consumers, and significantly impedes consumer choice’. Further, submissions to the Privacy Act Review Issues Paper identified that some entities are deliberately using dark patterns to undermine consumer autonomy in Australia, and that various measures discussed in the Privacy Act Review’s Discussion Paper, including more stringent criteria for what constitutes consent, might also limit the ability of companies to use dark patterns. Dark patterns are firmly on the radar of Australian regulators. In the meantime, focus will be on the development of the DSA as negotiations progress with EU member states.

 

Authors: Tim Gole, Jen Bradley, Ziggy Liszukiewicz and Tara Walsh
