09/06/2021

Online interactions are part of almost every aspect of our lives, with most of us using online services for work, social interaction, entertainment, education, financial transactions and other purposes. As the online world becomes more ingrained in our lives, Australians of all ages are increasingly exposed to harms that occur online.

On 24 February 2021, the Commonwealth Government introduced the Online Safety Bill (the Bill) into Parliament, aimed at improving online safety for Australians. The Bill was agreed to in the House of Representatives on 16 March 2021 and is currently before the Senate for consideration and passage. If passed in its current form, it will have significant implications for many online service providers, introducing new measures designed to increase industry accountability for the online safety of end-users, while also giving the eSafety Commissioner enhanced powers to enforce the Bill effectively.

The Bill is also intended to consolidate a patchwork of existing online safety legislation into a more harmonised regulatory framework. It therefore works alongside the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021, which, among other things, repeals the Enhancing Online Safety Act 2015 (Cth) (EOSA) and its existing cyber-bullying scheme targeted at children, and amends the Broadcasting Services Act 1992 (Cth) (BSA) and the Criminal Code Act 1995 (Cth) (as amended by the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019) (Criminal Code Amendment).

Who does the Online Safety Bill apply to?

The Bill applies to a very broad category of online service providers, including:

  • social media service providers – providers of online services with the sole or primary purpose of enabling online social interaction between end-users (e.g. Facebook, Instagram, TikTok, Snapchat, LinkedIn);
  • relevant electronic service providers – providers of electronic services that allow end-users to communicate with other end-users (e.g. Gmail, Outlook, WhatsApp);
  • designated internet service providers – providers of services that allow end-users to access material on the internet using an internet carriage service, or services that deliver material to persons by means of an internet carriage service;
  • internet service providers – suppliers of internet carriage services to the public (i.e. the service is used for the carriage of material between two end-users and each end-user is outside the supplier’s immediate circle) (e.g. Optus, iiNet);
  • app distribution service providers – providers of app stores that allow end-users to download apps (e.g. Apple (iOS App Store) and Google (Google Play Store)); and
  • hosting service providers – providers of services that host stored material provided on a social media service, relevant electronic service or designated internet service (e.g. cloud service providers such as Amazon and Microsoft).

The Bill also applies to end-users who post harmful material in certain instances; however, the focus of the legislation is largely on online service providers.

What does the Online Safety Bill seek to protect against?

The Bill seeks to enhance protections against specific ‘online harms’, including:

  • cyber-bullying material targeted at Australian children;
  • cyber-abuse material targeted at Australian adults;
  • non-consensual sharing of intimate images;
  • exposure online to class 1 material (‘Refused Classification’ under the Classification (Publications, Films and Computer Games) Act 1995 (Cth)) and/or class 2 material (‘X18+’ and ‘R18+’ under that Act); and
  • material depicting abhorrent violent conduct (e.g. terrorist acts, murder, attempted murder, torture, rape and kidnapping).

The schemes under the Bill apply differently to each type of online service provider:

 

[Summary table: application of the Online Safety Bill to online service providers]

 

How does the Online Safety Bill enhance protections? 

  1. Removal notices – online harms

The Bill establishes a complaints-based removal notice system for online harms. Generally, under this system, the eSafety Commissioner may give social media service providers, relevant electronic service providers, designated internet service providers and hosting service providers a removal notice requiring them, within 24 hours, to remove, or take all reasonable steps to remove (depending on the type of harm concerned), certain material from their service or, in the case of hosting service providers, to take all reasonable steps to cease hosting the material.

Generally, for a removal notice to be given, the material must:

  • be accessible to Australian end-users;
  • have been the subject of a complaint to the relevant online service provider;
  • not have been removed by the online service provider after receipt of that complaint; and
  • subsequently be the subject of a complaint to the eSafety Commissioner.

In some cases, a complaint only needs to be made to the eSafety Commissioner, rather than to the relevant online service provider in the first instance (for example, under the scheme for the removal of an intimate image that has been shared without consent). In certain circumstances, the eSafety Commissioner can also issue a removal notice without any complaint having been made (for example, in relation to class 1 and class 2 material).

Remedial notices may also be issued in respect of class 1 and class 2 material. Instead of requiring removal outright, these notices require the online service provider to take all reasonable steps to ensure that either the material is removed from the service or access to the material is made subject to a restricted access system.

The removal and remedial notice system under the Bill offers a number of key reforms to the existing online regulatory framework by:

  • extending the current scheme targeting cyber-bullying against children under the EOSA, which applied only to social media platforms, to a broader range of online service providers; and
  • establishing a completely new scheme targeting cyber-abuse against Australian adults.

The Bill requires compliance with removal and remedial notices within 24 hours of receipt of the notice, down from the 48 hours previously allowed under the EOSA’s child cyber-bullying scheme.

  2. Blocking notices – abhorrent violent conduct

The Bill establishes a new power for the eSafety Commissioner to require or request internet service providers to block access to material that depicts, incites or instructs abhorrent violent conduct. To issue a blocking notice, the eSafety Commissioner must be satisfied that the availability of such material online is likely to cause significant harm to the Australian community.

The intention behind this power is to prevent rapid online proliferation of abhorrent violent material, as occurred after the 2019 terrorist attack in Christchurch, New Zealand. This complements the eSafety Commissioner’s powers in the Criminal Code Amendment, under which failure to ensure the expeditious removal or cessation of hosting of abhorrent violent material can lead to large fines and criminal prosecution.

  3. Basic online safety expectations

Part 4 of the Bill articulates a set of Basic Online Safety Expectations (BOSE) that the Minister can determine apply to providers of social media services, relevant electronic services and/or designated internet services. The BOSE represent a significant expansion of the ‘basic online safety requirements’, which previously applied only to social media service providers under the EOSA.

The BOSE may include that, among other things, online service providers:

  • take reasonable steps to ensure end-users can use services in a safe manner;
  • minimise the extent to which certain types of material are provided on the service;
  • take reasonable steps to ensure that measures are in effect to prevent access by children to class 2 material;
  • have clear and readily identifiable mechanisms for end-users to report and make complaints about certain material; and
  • on notice, give the eSafety Commissioner information about complaints, timeframes to comply with removal notices and measures taken to ensure that end-users are able to use the service in a safe manner.

The core expectations set out in the Bill are not exhaustive, and the Minister may use their discretion to specify other expectations in a determination.

The eSafety Commissioner may require service providers to report on their compliance with the applicable BOSE, either periodically (at regular intervals) or non-periodically (for a specified period).

  4. App removal and link deletion notices under the modernised online content scheme

The Bill replicates and uplifts the online content scheme that currently sits in Schedules 5 and 7 of the BSA, bringing providers of internet search engine services and app distribution services within the remit of the scheme. In specific circumstances, the Bill enables the eSafety Commissioner to issue a link deletion notice to the provider of an internet search engine service, or an app removal notice to the provider of an app distribution service, requiring it to cease (for a limited period) providing a link to, or an app containing, class 1 material.

The modernised online content scheme also strengthens the eSafety Commissioner’s powers to deal with class 1 material (for example, under section 110 of the Bill, the eSafety Commissioner can give a hosting service provider a removal notice in relation to class 1 material, regardless of whether the material is hosted in Australia).

What are the consequences of non-compliance? 

The eSafety Commissioner has a range of enforcement options against online service providers that fail to comply with the Bill’s strict requirements, including civil penalties (up to 500 penalty units for individuals, or AUD 555,000 for bodies corporate), formal warnings, infringement notices, enforceable undertakings and injunctions.
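For context, the AUD 555,000 figure can be reconstructed from the standard Commonwealth penalty framework (a rough sketch, assuming the Commonwealth penalty unit value of AUD 222 applying from 1 July 2020 and the five-times multiplier for bodies corporate under the Regulatory Powers (Standard Provisions) Act 2014 (Cth)):

  • 500 penalty units × AUD 222 per unit = AUD 111,000 (maximum for an individual); and
  • AUD 111,000 × 5 = AUD 555,000 (maximum for a body corporate).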

Online Safety Bill: the key takeaways 


The Bill both consolidates and reforms existing legislation, and gives the eSafety Commissioner broad new powers and the ability to enforce them effectively.

In addition to the eSafety Commissioner’s removal and blocking powers set out above, the eSafety Commissioner has the power to:

  • obtain contact or identifying information from online service providers about individuals using anonymous accounts to abuse or bully others, or to share intimate images of others without consent (in doing so, the eSafety Commissioner must believe on reasonable grounds that the identifying information is relevant to the operation of the Bill); and
  • ‘do all things necessary or convenient to be done in connection with the performance of the eSafety Commissioner’s functions’.

We’ve written in more detail about the strengthened role of the eSafety Commissioner in The Most Important Regulator You May Have Never Heard Of.

The Bill, if passed in its current form, will have significant impacts for online service providers. Given the extremely broad range of services it captures, online service providers:

  1. must be aware of the breadth of these powers, from the issuing of removal notices, to obtaining end-user contact information in certain circumstances; and
  2. must be prepared to proactively protect end-users against online harms, or risk being subject to removal notices with 24-hour response times under the Bill.

In the UK, a similar Draft Online Safety Bill was published on 12 May 2021. While the objectives of the UK legislation are similar to those of the Bill, certain types of business-related services are exempt from the UK bill’s remit, including email-only services, SMS and/or MMS-only services, services offering only one-to-one live aural communications, internal business services, limited functionality services, and services provided by public bodies. As the Australian Bill is debated in the Senate, it remains to be seen whether similar exemptions will be considered here.

Pending the Online Safety Bill’s passage, if you are caught by the broad scope of the Bill, it is important to take note of the proposed new laws (and their impact on pre-existing laws) and to update your services, policies and processes accordingly.

 

Authors: Melissa Fai, Jen Bradley and Meaghan Powell 
