16/08/2022

Last year we wrote about the expansion of the eSafety Commissioner from a regulator of relatively narrow scope (combatting cyberbullying of Australian children) to a more generalist authority with responsibility for the online safety of all Australians. Those powers expanded significantly under last year’s Online Safety Bill, and the finalised Online Safety Act 2021 (Cth) (OSA) commenced earlier this year.

The implementation of the OSA was something of a capstone to the former Coalition Government’s sometimes controversial (and sometimes unique) approach to regulating the online world. From the rapid passage of the abhorrent violent material laws in response to the Christchurch terrorist attack, to the News Media Bargaining Code and the (ultimately unsuccessful) proposed anti-trolling reforms, the past few terms of Federal Parliament have dramatically shifted how Australian law expects service providers to respond to online harms, and how it approaches intermediary liability.

This article outlines the current state of play in one of Australia’s fastest-growing regulatory areas, before turning to thoughts on how the new Labor Government may influence a change of course in this space.

Online Safety Act

The OSA reflects the Commonwealth Government’s most dramatic overhaul of online safety laws since the introduction of the eSafety Commissioner in 2015 (then just the ‘Children’s eSafety Commissioner’). In addition to endowing the eSafety Commissioner with several new responsibilities and powers, the OSA consolidates a regulatory approach previously spread across the Enhancing Online Safety Act 2015 (Cth) (now repealed), the Broadcasting Services Act 1992 (Cth) and the Criminal Code Act 1995 (Cth).

The OSA contains several regulatory schemes that seek to limit, prevent and help remediate various forms of online harm. These include interpersonal harms, such as cyber-bullying of an Australian child, cyber-abuse of an Australian adult and image-based abuse, as well as harms relating to illegal and restricted material, which are addressed by the OSA’s revised Online Content Scheme and abhorrent violent conduct powers.

The OSA is broad in scope, applying to providers of services in the following ‘sections of the online industry’:

  • Social media services: services that have the sole or primary purpose of enabling online social interaction between end-users, where end-users can also link to other end-users and post material on the service. Examples: social networking sites (Facebook), community forums (Reddit).
  • Relevant electronic services: services that allow end-users to communicate with other end-users by means of email, instant messaging, SMS, MMS, chat services or online games. Examples: text messaging, social messaging (WhatsApp), webmail services (Gmail), in-game chat.
  • Designated internet services: services, other than a social media service or relevant electronic service, that allow end-users to access material on the internet using an internet carriage service, or that deliver material to persons by means of an internet carriage service. Examples: websites and other online services.
  • Internet search engine services: while not defined in the OSA, generally services that respond to end-user queries by collecting and ranking information on the world wide web. Examples: search engines (Bing, Google).
  • App distribution services: services that enable end-users to download apps by means of a carriage service. Examples: mass-market app stores (Apple App Store, Google Play Store).
  • Hosting services: services that enable the hosting of stored material that is provided on a social media, relevant electronic or designated internet service. Examples: web hosting services, media hosting services.
  • Internet carriage services: listed carriage services that enable end-users to access the internet. Examples: listed carriers (Optus, Telstra).

The OSA applies to these different industry participants in varying ways, with some industry sections (such as social media services) attracting more extensive obligations than others. Providers of equipment used in connection with a relevant electronic, social media, designated internet or internet carriage service may also attract obligations under future industry codes or standards made pursuant to the OSA (see below for more on this).

While the substantive focus of the OSA is on the service providers described above, it should also be noted that some powers under the legislation apply to Australian end-users themselves (such as removal notices with regard to cyber-bullying or cyber-abuse material).

Basic Online Safety Expectations

In addition to imposing obligations on service providers to regulate illegal and restricted content, the OSA also seeks to increase industry action, accountability and transparency when it comes to online safety. One way this is intended to be achieved is through a set of ‘basic online safety expectations’ (BOSE), which the Minister is empowered to determine under the OSA by way of a separate legislative instrument. A formal BOSE Determination was registered on 23 January 2022, commencing alongside the OSA itself.

As stated in the initial BOSE consultation paper, the BOSE are a mechanism for the Commonwealth Government to “articulate its expectations of online services providers, on behalf of the community, to improve protections for users”, on the basis that “service providers are best placed to identify emerging forms of harmful end-user conduct or material, and choose the best way to address them on their service, in the most responsive way”. As such, while the balance of the obligations under the OSA require online service providers to respond reactively to online harms, the BOSE set a higher standard: they are designed to ensure that certain online service providers prioritise online safety by imposing minimum standards for pre-emptive and preventative action to minimise harms that can occur via their services, such as cyber-bullying or exposure to illegal or restricted material.

Core vs additional expectations

The OSA requires the BOSE to include certain pre-specified, principles-based ‘core expectations’ set out in section 46. The OSA does not, however, limit the expectations that can be specified by the BOSE. The BOSE Determination therefore articulates the required core expectations as well as a number of additional expectations (which generally expand on a core expectation).

Taking reasonable steps

Most of the expectations require an online service provider to take ‘reasonable steps’ to do or achieve a particular objective targeted at minimising online harm. The BOSE Determination does not prescribe how the expectations must be met, the intention being to preserve flexibility for the online service provider to determine the method of achieving the objective. Nevertheless, and without limiting the expectations, the BOSE Determination sets out examples of reasonable steps that an online service provider could take for the purpose of giving effect to certain expectations.

Examples of BOSE

Examples of expectations contained in the BOSE Determination under the heading of ‘expectations regarding safe use’ include taking reasonable steps to:

  • Core Expectation: ensure end-users are able to use the service in a safe manner;
    • Additional Expectation: proactively minimise the extent to which material or activity on the service is unlawful or harmful;
      • Example of reasonable steps: develop and implement processes to detect and address unlawful or harmful material or activity on encrypted services;
    • Additional Expectation: prevent anonymous accounts being used to deal with material, or for activity, that is unlawful or harmful; and
    • Additional Expectation: consult and cooperate with providers of other services to promote safe use by end-users.

Other examples from the BOSE, this time falling under the heading of ‘expectations regarding certain material and activity’, include taking reasonable steps to:

  • Core Expectation: minimise the extent to which certain material (such as non-consensual intimate images, cyber-abuse material or class 1 material) is provided on the service; and
  • Core Expectation: ensure that technological or other measures are in effect to prevent children from accessing class 2 material on the service;
    • Example of reasonable steps: implement age assurance mechanisms or conduct child safety risk assessments.

The BOSE Determination also sets out several further expectations regarding end-user reports and complaints, policies and procedures of providers, record keeping and information sharing with the eSafety Commissioner.

Who is subject to the BOSE, and are they enforceable?

Presently, the BOSE Determination only applies to providers of social media services, relevant electronic services and designated internet services; however, further categories of provider can be added in the future. Although the OSA gives the Minister the power to determine separate BOSE for each class of online service provider set out above, the BOSE Determination presently applies equally to each of them.

The BOSE Determination does not impose a legally enforceable duty on online service providers, meaning a failure to meet the expectations does not in and of itself provide a basis for any regulatory action or claim in court.

However, applicable online service providers must comply with certain statutory reporting requirements of the eSafety Commissioner in relation to the BOSE. These reporting requirements may be imposed by a determination of the eSafety Commissioner covering a class of online service providers (no such determination has been made at this time), or by notice given to individual online service providers from time to time. In either case, the eSafety Commissioner can require the online service provider to prepare and provide either a one-off or periodic report about the extent to which it is complying with one or more specific BOSE. A failure to comply with these reporting requirements:

  • carries a civil penalty of up to 500 penalty units (currently $111,000, based on the Commonwealth penalty unit value of $222); and
  • may be the subject of a statement published by the eSafety Commissioner on its website; the eSafety Commissioner may also publish such a statement where it is satisfied that an online service provider has failed to meet one or more of the BOSE.

Industry Codes

In addition to the BOSE, and as part of the Online Content Scheme, the OSA also provides for the creation of industry codes, which are to be developed by industry participants and associations, and then approved and registered by the eSafety Commissioner. These industry codes are intended to apply across the entire gamut of online industry service providers, from social media services to those responsible for equipment manufacture, installation and maintenance.

In September 2021, the eSafety Commissioner published its Position Paper on the development of industry codes under the OSA. In addition to setting out a series of positions that the eSafety Commissioner expects industry to address through the codes, the Position Paper also laid out an indicative timeline for the drafting, consultation and, if approved, registration of industry codes.

The eSafety Commissioner has proposed that two industry codes be produced:

  • one dealing with ‘class 1A’ and ‘class 1B’ material (due for registration in 2022); and
  • another dealing with ‘class 1C’ and class 2 material (due for registration in 2023).

While the OSA only defines class 1 and class 2 material, the eSafety Commissioner developed these further subcategories to distinguish between different material within each class, recognising differential harm profiles and industry responses. Generally speaking, class 1A represents the most extreme and harmful material online, including child sexual exploitation and pro-terror material, whereas class 1B includes material that provides instruction on matters of crime, drugs and violence. Classes 1C and 2A are focused on different types of pornography, and class 2B includes simulated pornography as well as other high-impact material.

Work is currently underway by industry participants and representative associations to produce these industry codes. While the final form of the industry codes is yet to be settled, they are expected to materially alter compliance expectations for large swathes of the online ecosystem (and not just traditional tech players). Once registered, industry codes can be enforced through written directions issued by the eSafety Commissioner, and non-compliance with such a direction may attract significant financial penalties.

Looking ahead

The newly elected Labor Government has not foreshadowed any significant changes to the OSA at this stage, stating that it is too early to review the Act (which received bipartisan support after opposition and crossbench amendments). While in opposition, Labor campaigned for years for stronger government responses to online harms, particularly as they relate to children. For instance, the now Minister for Communications, the Hon Michelle Rowland MP, previously questioned why the eSafety Commissioner had failed to exercise long-held statutory powers to combat cyberbullying of Australian children.

It remains to be seen how the Minister and the Labor Government will approach online safety more broadly during this parliamentary term. Early reported signals from the Minister point to an increased focus on misinformation as a threat to the health, safety, trust and cohesion of the Australian public. Misinformation, while not regulated by the OSA, forms part of the broader online safety equation in Australia and abroad.

The Minister has flagged information gathering from online platforms as an initial hurdle in regulating misinformation, with the voluntary industry codes on disinformation and misinformation said to be undermined by the Australian Communications and Media Authority’s lack of a legislated ability to obtain related information. It is hoped that robust information gathering powers will help inform government on appropriate next steps in the fight against misinformation.

On its face, this approach reflects a more gradual and pragmatic stance than has been witnessed in online policy making in the recent past. The end of 2021 saw what some commentators perceived as an uncoordinated and overlapping approach to online law reform, with both the Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021 (OP Bill) and the Social Media (Anti-Trolling) Bill 2022 (SMAT Bill) being proposed for introduction before the former Coalition Government left office.

Today, it seems neither the OP Bill nor the SMAT Bill will be carried forward by their originating department, the Attorney-General’s Department (AGD). The returning Attorney-General, the Hon Mark Dreyfus QC MP, has instead set the AGD’s sights on broader reform of the Privacy Act 1988 (Cth), which has potential for overlap with online safety concepts (particularly as they relate to the data and monitoring of children).

G+T will continue to monitor new developments in this space, including how the significant new powers and instrumentalities of the OSA play out in practice.

Authors: Melissa Fai, Jen Bradley, Bryce Craig, Nathan Allan
