Earlier today the Attorney-General released the Privacy Act Review Report (the Report). The Report contains 116 proposals for reforming the Privacy Act 1988 (Cth) (the Privacy Act). These proposals aim to make, in the Attorney-General’s words, the Privacy Act “fit for purpose” to “adequately protect Australians’ privacy in the digital age”.
The Attorney-General first announced that the Australian Government would conduct a review (the Review) of the Privacy Act in December 2019. The Review aimed to investigate the effectiveness of Australia’s current data protection regime to ensure it “empower[s] consumers, protect[s] their data and best serve[s] the Australian economy”. Now, following the publication of an issues paper in October 2020, a discussion paper in October 2021 and several rounds of public consultation, the long-awaited Report has been released.
The 116 proposals are described at a principles level. The Report does not attach an exposure draft of any reform legislation, and many of the proposals are marked as subject to further consultation. While the Report gives us a clearer picture of the future direction of the Privacy Act, there are still many important details that need to be filled in.
Despite this lack of detail, it’s becoming very clear that the upcoming Privacy Act reforms will require businesses to make substantial changes to the way they interact with individuals and handle personal information.
Highlights of the Privacy Act Review Report
- The requirement to act fairly and reasonably when collecting, using and disclosing personal information (Proposal 12). The Report stresses that this requirement will be judged on an objective standard and will apply regardless of any consent – meaning that tick boxes and privacy policies will not cure inappropriate data collection and use. Helpfully, the Report lists a number of factors to be taken into account when determining whether any collection, use or disclosure of personal information is fair and reasonable. This broad fairness concept mirrors the ACCC’s continued advocacy for a general prohibition on unfair trading practices in the fifth Digital Platform Services Inquiry.
- Amended definition of consent (Proposal 11) to make it clear that consent must be voluntary, informed, current, specific and unambiguous – which is the same standard of consent contained in our existing APP Guidelines. However, there is no proposal to change the circumstances in which an APP entity is required to obtain consent, and the Report notes that consent does not need to be express; implied consent may still be relied upon (provided the implied consent is ‘unambiguous’). The Report also proposes that the OAIC develop guidance on how online services should design consent requests – which could result in a major UX re-design process for many online services.
- Broader definition of personal information (Proposals 4.1 - 4.4). The Report proposes changing the word “about” in the definition of personal information to “relates to” (that is, “information or an opinion that relates to an identified individual…”). This change would allow the definition to capture a broader range of information. The change would also bring the definition in line with other Commonwealth legislation that uses ‘relating to’ when regulating information on privacy (for example, the Competition and Consumer Act 2010 (Cth) and the Telecommunications (Interception and Access) Act 1979 (Cth)) and bring the Privacy Act definition in line with the language used in the GDPR definition of ‘personal data’. The Report also proposes that any inferred or generated information will be deemed to have been ‘collected’ within the meaning of the Privacy Act. This will have important consequences for the AI industry.
- Direct right of action to enforce privacy rights (Proposal 26). The Report proposes a direct right of action for individuals who have suffered loss or damage as a result of an interference with their privacy. This would allow individuals (and representative groups) to seek compensation in the Federal Court or the Federal Circuit Court. Importantly, this direct right of action is not proposed to replace the existing complaints process and individuals will have to make a complaint to the OAIC prior to commencing court action.
- Additional obligations around de-identified information (Proposals 4.5 – 4.8). The Report proposes extending APP 11.1 (obligations to protect de-identified information from unauthorised access or interference) and APP 8 (obligation to take steps reasonable in the circumstances to ensure overseas recipients do not breach the APPs) to apply to de-identified datasets. The Report also recommends prohibiting APP entities from re-identifying de-identified information received from a third party and introducing a new criminal offence for “malicious” re-identification intended to cause harm or obtain an illegitimate benefit. This may impact organisations that rely on anonymisation and de-identification to perform data analytics, including the AI industry.
- Tighter timeframes for Notifiable Data Breaches (Proposal 28). The Report proposes that the deadline for reporting eligible data breaches to the OAIC will be reduced to (a GDPR-familiar) 72 hours from when the entity becomes aware that there are reasonable grounds to believe that there has been an eligible data breach. Notification to impacted individuals must be completed ‘as soon as practicable’. Under the existing regime, where an entity has reasonable grounds to suspect, but does not yet believe, that an eligible data breach has occurred, it has a 30-day period to make an assessment of the breach. The Report also proposes that any statement issued to the OAIC or any individual about an eligible data breach must set out the steps the entity has taken or intends to take in response to the breach.
- Additional obligations when handling employee records (Proposal 7). Some businesses may give a sigh of relief that the employee records exemption is to be retained, but on a more nuanced basis – that is, certain Privacy Act obligations will be extended to private sector employees, in particular obligations relating to transparency of collection and use of employee information, protection against unauthorised access or interference, and eligible data breach reporting. The Report flags that further consultation is required to determine how this should be implemented in legislation and hints that it could use either the architecture of the Fair Work Act or the Privacy Act. The nature of Australia’s current employee records exemption is speculated to be a major barrier to achieving GDPR adequacy status, so it may be surprising to some to see that the exemption will be mostly retained.
- Introduction of the concept of processors and controllers in Australian law – to make it more akin to other jurisdictions, most notably the GDPR (Proposal 22). The Report proposes that, where processors are acting on the instructions of a controller, they will have fewer compliance obligations under the Privacy Act. This is likely to be a welcome proposal to many businesses that currently struggle with implementing some of the existing APPs where there is no direct touchpoint with individuals. The Report suggests that processors would only be responsible for complying with APP 1 (open and transparent management of personal information), APP 11 (security of personal information) and the notifiable data breach scheme (although it is proposed that processors will only be required to notify the OAIC and the controller, not impacted individuals).
- The requirement to conduct Privacy Impact Assessments (Proposal 13). The Report proposes mandatory Privacy Impact Assessments (PIA) for any ‘high privacy risk activity’, which would encompass activities ‘likely to have a significant impact on the privacy of individuals’. In completing a PIA, an APP entity would be required to assess potential impacts on privacy, consider whether these are proportionate and may be required to mitigate these impacts. The Report proposes that the OAIC will publish guidance specifying factors that may be indicative of a high-risk activity to help APP entities understand when they need to complete a PIA.
- Regulation of targeted advertising (Proposal 20). The Report proposes prohibitions on the use of information related to an individual (including personal information, de-identified information, and unidentified information (such as internet tracking history)) for targeting advertising and content to children, and prohibitions on using sensitive information for targeting advertising and content to any individuals. Individuals would have a right to opt out of receiving targeted advertising and content, and any permitted targeting must be ‘fair and reasonable’ and come with transparency requirements about the use of algorithms and profiling to recommend content to individuals. These changes draw from regulation introduced by the European Commission last year under the Digital Services Act.
- Additional protections for children and vulnerable persons (Proposals 16 and 17). Several additional protections are proposed specifically in relation to children. These include codification of existing OAIC guidance on consent and capacity, requiring entities to make collection notices and privacy policies ‘clear and understandable’, and requiring entities to have regard to the best interests of the child in their consideration of the fair and reasonable test (see item 1 above). The Report also proposes developing a Children’s Online Privacy Code applicable to services that children are likely to access, which would be modelled on the UK’s Age Appropriate Design Code. The Report further proposes that where an activity may have a significant impact on vulnerable persons, this must be considered in the fair and reasonable test (see item 1 above), and a Privacy Impact Assessment must be performed. These proposed reforms may require organisations, depending on their business, to adopt different data handling practices across their customer base.
- Statutory tort of privacy (Proposal 27). The Report recommends the introduction of a statutory tort for serious invasions of privacy that are intentional or reckless. Importantly, the invasion of privacy need not cause actual damage and individuals may claim damages for emotional distress. The Report suggests that the OAIC should be able to appear as amicus curiae and intervene in proceedings with leave of the court for both the direct right of action under the Privacy Act and the tort for invasion of privacy. A statutory tort for invasion of privacy was proposed in the Australian Law Reform Commission’s 2014 Report ‘Serious Invasions of Privacy’ and then again in the ACCC’s 2019 ‘Digital Platforms Inquiry – Final Report’, without ever being implemented into law.
- Introduction of a right of erasure (Proposal 18.3). The Report proposes introducing a right of erasure that would provide individuals with the ability to request the deletion of their personal information by APP entities. This right of erasure is essentially an extension of the obligation to delete personal information once it is no longer required, and individuals will be able to exercise this right in relation to any category of personal information. The Report also proposes a right of de-indexation, which is surprising because the Discussion Paper seemed to reject this idea. This will allow individuals to require search engines to de-index online search results where the results are excessive in volume, inaccurate, out of date, incomplete, irrelevant or misleading. Search engines will also be required to de-index sensitive information and information about minors. Importantly, the Report recommends that these rights should be subject to exceptions where there are competing public interests, where retention is required or authorised by law, or where compliance is technically infeasible or an abuse of process.
- Greater enforcement powers and penalties (Proposal 25). In addition to the enhanced penalties and expanded OAIC powers passed in December 2022, the Report proposes various measures to strengthen enforcement of the Privacy Act. In particular, it proposes new civil penalties and a slew of new powers for the OAIC in relation to investigations, public inquiries and determinations. The Report also proposes to amend section 13G of the Privacy Act (the civil penalty provision for “serious or repeated interference with privacy”) to provide more guidance on what amounts to a “serious interference”. The threshold for a “serious interference” has been softened, and may include interferences that involve “sensitive information” or other information of a sensitive nature, interferences adversely affecting large groups of individuals (likely reflecting cyber incident circumstances), or serious failures to take proper steps to protect personal information. This is significant because, following the December 2022 amendments, the maximum penalty under amended section 13G of the Privacy Act is at least $50 million.
The Government is seeking feedback on the Report from both the private and public sectors. The deadline for submitting feedback is 31 March 2023.
After the consultation period has closed, the Government will formally respond to the Report. We expect this response will indicate which of the 116 proposals will be implemented in amending legislation. After that, it is likely that an exposure draft of an amendment bill will be released, but it is difficult to say at this stage how long that process will take given how long it took to get to where we are today.
Authors: Melissa Fai, Andrew Hii and Claire Harris