01/03/2021

A fundamental function of any government is to provide for the safety and wellbeing of its citizens. While we commonly think of law enforcement, emergency services and public healthcare as the primary tools used by governments to meet this function, in recent years governments around the world have also grappled with how to keep their people safe online.

Given that almost all of us now socialise, work, transact, learn and, in the case of young people, grow up on the internet, it is perfectly logical (and reasonable) to expect governments to respond to the unique challenges that arise in the digital world. Such challenges range from cyberbullying, image-based abuse and doxing, to the sharing of misinformation and other objectionable content. You might be surprised to know that the world’s first regulator dedicated to online safety emerged in Australia: the eSafety Commissioner.

Enter the eSafety Commissioner

This regulator was originally introduced as the Office of the Children’s eSafety Commissioner under the Enhancing Online Safety for Children Act 2015 (Cth) (EOSA). Developed after significant public consultation dating back to January 2014, the original legislation had a clear focus on responding to cyberbullying against children.

The Children’s eSafety Commissioner was empowered to field complaints of this nature and take practical steps through issuing take-down notices to social media platforms and their users. Establishing such a regulator signalled a new level of government acknowledgment of, and advocacy against, cyberbullying in Australia. Structurally, the Children’s eSafety Commissioner was set up as an independent statutory authority within the Australian Communications and Media Authority (ACMA), and also took over responsibility for the Online Content Scheme (OCS).

Significant reforms occurred in 2017, when a series of amendments to the EOSA removed the regulator’s exclusive focus on children and charged the renamed ‘eSafety Commissioner’ with safeguarding the online safety of all Australians. However, enforcement powers were still limited to managing complaints relating to the cyberbullying of children and the OCS.

In 2018, the EOSA was again amended to bolster the eSafety Commissioner’s ability to respond to the pervasive societal challenge of the non-consensual sharing of intimate images, otherwise known as ‘sextortion’, ‘revenge porn’ or image-based abuse. The EOSA now makes the actual or threatened posting of intimate images subject to a civil penalty provision, with a designated complaints and objections framework, as well as a framework for requiring the removal of offending content.

The current core functions of the eSafety Commissioner are described below:

Cyberbullying

As mentioned above, the eSafety Commissioner arose out of a mandate to officially deal with cyberbullying. Presently, the eSafety Commissioner’s powers with regard to cyberbullying are still limited to incidents involving a target who is an Australian child.

Impacted children or their authorised representatives are encouraged to lodge complaints with the eSafety Commissioner where they have been the target of cyberbullying, or of seriously threatening, intimidating, harassing or humiliating behaviour online. Each complaint is assessed on a case-by-case basis, and if the eSafety Commissioner determines that action is needed, a number of options are available.

Principally, the EOSA creates a scheme whereby the eSafety Commissioner can request that social media services remove offending material from their service. Social media services are categorised into one of two tiers, with resulting consequences for the level of regulatory intervention available. Tier 2 services may be subject to legally binding notices for the removal of offending content, as well as civil penalties for non-compliance; these include the larger social media services, being Facebook, Instagram and YouTube. Tier 1 services participate in the scheme voluntarily and include a broader range of services, from mainstream social media players TikTok, Snapchat and Twitter, to forums like Yahoo 7 Answers, and gaming platforms like Roblox.

In addition to sending notices to social media services, the eSafety Commissioner can also issue notices to the individuals who themselves post cyberbullying material. Other enforcement options include directly contacting appropriate schools and parents, or making a referral to relevant police forces for potential criminal investigation.

When it comes to cyberbullying against adults, the eSafety Commissioner fields complaints in a similar way, but does not yet have legislative powers with regard to content removal (except in the case of image-based abuse).

Image-based abuse

Image-based abuse primarily centres on the non-consensual distribution of images (whether still or video) in which a person is nude, partially nude or engaged in sexual activity. However, the definition of ‘intimate image’ in the EOSA is broad, and extends to images of a person in circumstances where they would ordinarily expect to be afforded privacy, such as using the toilet or getting changed. Also considered intimate images under the EOSA are images of a person who would normally wear particular attire due to their religious or cultural background, depicted without that attire in circumstances where one would reasonably expect privacy.

For the purposes of interpreting the Act, where material depicts, or appears to depict, part of the body of a person, the material will be taken as depicting or appearing to depict the person. Additionally, whether or not an image has been altered is immaterial under the EOSA, meaning digitally altered images are still covered.

The eSafety Commissioner administers a civil penalty scheme with regard to image-based abuse, as well as a complaints and take-down scheme similar to that used for cyberbullying. Complaints are only considered where the person in the image lives in Australia, the person who posted the image lives in Australia, or the image is hosted in Australia.

In response to a complaint, the eSafety Commissioner may issue an enforceable removal notice to the relevant platform hosting the image, which extends beyond social media services to include individual end-users as well as other relevant electronic services, designated internet services or hosting service providers.

The eSafety Commissioner can also take more direct action against a person who posts or threatens to post an intimate image of another person online, including by issuing infringement notices, accepting enforceable undertakings or seeking a court injunction. Posting or threatening to post an intimate image, or failing to comply with a removal notice, is a civil penalty provision subject to a maximum penalty of 500 penalty units ($111,000 as at the time of writing).
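For reference, that dollar figure follows from the value of a Commonwealth penalty unit which, on our understanding, was $222 as at the time of writing (the rate applying from 1 July 2020):

500 penalty units × $222 per unit = $111,000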

Online Content Scheme

As mentioned above, the eSafety Commissioner also administers the OCS contained in the Broadcasting Services Act 1992 (Cth) (BSA). Originally set up as a complementary scheme to traditional broadcast content regulation, the OCS empowers the eSafety Commissioner to investigate complaints about online content that is illegal or offensive. Complaints can be made by an individual residing in Australia, a body corporate carrying on business in Australia, the Commonwealth, or a state or territory.

Determining what is offensive, or ‘prohibited content’, requires an assessment of where such content is likely to fall on the standard classification spectrum. Where content is, or is likely to be, classified above MA15+, including R18+ and RC (refused classification), it is required to be kept behind an age-gated restricted access system. If satisfied that a piece of content is prohibited and not appropriately restricted, the eSafety Commissioner may issue a take-down notice to a local host of the content, or an access-prevention notice to a local ISP where the content is hosted overseas. Failure to comply with such notices can result in substantial financial penalties.

While the OCS catches a wide range of service providers in its scope, from streaming platforms, to dating apps and social media sites, in practice the eSafety Commissioner allocates much of its resources towards the investigation and removal of the most extreme forms of illegal content, such as child abuse material, terrorist propaganda and material that incites violence.

Abhorrent Violent Material

The final core function of the eSafety Commissioner is the assessment of content that fits the legislative definition of abhorrent violent material (AVM) under the Criminal Code Act 1995 (Cth). The AVM amendments were swiftly introduced in the wake of the 2019 Christchurch terrorist attack, in which the perpetrator live-streamed the mass shooting, footage of which later spread virally online.

AVM is defined as any material created by a perpetrator or accomplice that depicts a terrorist act involving serious injury or death, a murder or attempted murder, torture, sexual assault or violent kidnapping. The amendments create criminal offences for content, internet and hosting providers who fail to remove or report AVM; these offences are enforced by Commonwealth law enforcement agencies.

The eSafety Commissioner’s role extends only to reviewing complaints of AVM and, where appropriate, issuing notices to a website that publishes, or a service that hosts, said AVM. These notices are not mandatory take-down notices and do not compel action on the part of recipients. However, failure to comply with a notice request can prejudice a recipient in any later related criminal proceedings.

A Widening Scope for eSafety

The Australian Government is currently in the process of finalising an Online Safety Act (OSA), with the consultation period on the exposure draft having recently ended. Features of the OSA as currently proposed include:

  • Enhancing existing cyberbullying protections for children in the EOSA by expanding their application across a more relevant range of services;
  • Adding a world-first protection against cyber-abuse targeted at Australian adults, empowering the eSafety Commissioner to compel the removal of seriously harmful online abuse where a service has not already done so in response to a complaint;
  • Enhancing powers in relation to the existing OCS, enabling the eSafety Commissioner to require search engines and app stores to delist services that systemically ignore take-down notices for ‘class 1’ content, such as child abuse material;
  • Adding a new power for the eSafety Commissioner to rapidly block websites that host AVM or terrorist material during a crisis event;
  • Reducing existing timeframes for compliance with a range of OSA obligations from 48 to 24 hours; and
  • Establishing a set of basic online safety expectations for industry participants that are codified at law and incorporate mandatory reporting requirements.

The OSA will also require greater industry self-regulation through the development of industry-specific online safety codes. With the Australian Government due to respond to the consultation, and potentially introduce the final bill to Parliament, in the coming months, the eSafety Commissioner may find itself a much more powerful regulator as it enters 2022.

 

Authors: Andrew Hii and Bryce Craig
