The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Cth) (the Act) was passed by the Australian Parliament last Thursday (4 April) in response to the livestreaming, and subsequent sharing, of footage of the Christchurch terror attack, and commenced on 6 April. The legislation amends the Criminal Code Act 1995 (Cth) (Criminal Code) to criminalise the hosting and streaming of ‘abhorrent violent material’ in certain circumstances.
The Act is the second major legislative reform affecting the tech industry passed in recent months, the encryption legislation having passed late last year. What these two pieces of legislation have in common is that both have been subject to widespread criticism by industry and law bodies, both in terms of the manner in which they were passed (given the speed of their respective passage and the lack of consultation) and their substance. They reflect what might politely be termed an “agile” or “fail fast” approach to legal reform – laws being passed and, at the same time, referred to the Parliamentary Joint Committee on Intelligence and Security (PJCIS) for further review.
We have compared some of the public commentary surrounding the Act with what the Act actually says.
Much of the commentary in the lead-up to this Act focused on the possibility of prison terms for social media executives whose platforms failed to expeditiously remove offensive material. However, the Act does not penalise social media executives.
It is only where a natural person provides a content or hosting service that imprisonment is an available punishment.
Where a corporate entity provides the content or hosting service, the Act imposes financial penalties only (up to $10.5 million or 10% of annual global turnover).
Limited to social media?
The Act is not limited to social media: it covers internet service providers, content service providers and hosting service providers.
Examples of services covered by these laws include Gmail, Google Drive, Dropbox and Microsoft OneDrive. For many of these services, the service provider typically has little or no visibility over what content is being stored or communicated (and this reflects the expectations of users).
Where relevant ‘abhorrent violent material’ is being live-streamed or otherwise displayed on a person’s social media page, then it is obvious what the law is seeking to prevent (even if the drafting of the legislation is unclear in some areas). However, where the material is in a user’s private email or storage account and not being publicised (the Act simply requires that the service is used to “access” the material), it is far less clear what behaviour the law is seeking to change – particularly as the Act is focused on the providers of those services, and not their users.
The Act outlaws ‘abhorrent violent material’ and relies heavily on specific legal concepts in defining what falls within this concept. Putting that issue to one side, the Act limits itself to recordings of the criminal conduct only. It does not extend to the accessibility of extremist material more generally. So while the legislation may have resulted in social media sites taking down footage of the Christchurch attack more quickly than they did, it would likely have done very little to prevent the perpetrator from accessing the type of extremist material that may have inspired the violent acts in the first place.
In referencing the Christchurch attack in the Bill’s second reading speech, Attorney-General Christian Porter described the 69-minute period it took Facebook to take down the material, after it was made aware, as ‘unacceptable’. However, the Act merely requires those subject to it to act ‘expeditiously’. It will be a matter for the courts and juries to determine what is expeditious in particular circumstances.
While the government has faced criticism over the Act, it is clear that the government was responding to a public perception that social media companies could have done more to stop their services becoming platforms for violent extremist propaganda. Legislation like this reflects the view that social media companies have failed to provide leadership on issues that governments and the public have legitimate concerns over. The recent statement by Facebook founder Mark Zuckerberg calling for governments and regulators to have a more active role in online content regulation is perhaps a sign that the tide is turning.