While all eyes have been on the recent introduction of the privacy reform Bill to Parliament, a number of other developments over the last three months continue to shape the opportunities, legal risks and regulatory focus in relation to cyber, privacy, AI and data.
In addition to the more substantive updates below, also keep in mind:
Our Technology and Privacy specialists take you on a tour of the reforms in this article.
For those in a hurry, the Privacy and Other Legislation Amendment Bill 2024 (Cth) contains:
Certain aspects of the reforms will be more important to some organisations than others, so it’s important to carefully identify those that may impact your business and operations.
However, at a minimum, businesses should be aware of the specific Australian Privacy Principle (APP) provisions that are proposed to be subject to the OAIC’s “infringement notices” power (see our earlier article for a list). Compliance with these provisions should be an area of focus, given the relative ease with which the OAIC will be able to take action in the event of non-compliance (if the reforms are passed).
A raft of other changes recommended through the reform process was not included in this Bill. These reforms may be introduced at a later date.
Following the Government’s previous announcement of its proposed risk-based approach to regulating AI (which we reported on in an earlier edition of Digital Bytes), on 5 September 2024 the Department of Industry, Science and Resources issued a proposals paper on introducing mandatory guardrails for AI in high-risk settings, containing:
Submissions on the proposals paper are due by 4 October 2024.
The paper proposes two categories of AI use that should be considered “high-risk” and therefore subject to the proposed mandatory guardrails.
The first category covers AI uses whose applications are known or foreseeable, and proposes that whether a use is “high-risk” will depend on the adverse impact (and the severity and extent of that impact) on:
The second category deems any advanced and highly capable AI models, where all possible risks and applications cannot be predicted, to be “high-risk”.
The proposed mandatory guardrails are:
The proposals paper canvasses and seeks feedback on the following three options for implementing the proposed mandatory guardrails:
Organisations developing or deploying AI in “high-risk” settings may wish to submit feedback on the proposals paper by 4 October 2024.
On the same day as the release of the ‘mandatory guardrails’ paper described above, the Department of Industry, Science and Resources issued a Voluntary AI Safety Standard setting out 10 voluntary guardrails to help organisations developing and deploying AI to realise its benefits and manage its risks.
The intended audience is both developers and deployers of AI.
The 10 voluntary guardrails are the same as the proposed mandatory guardrails, set out above, with the exception of the 10th guardrail, which is:
Engage your stakeholders and evaluate their needs and circumstances, with a focus on safety, diversity, inclusion and fairness.
Organisations already developing or deploying AI should consider adopting the 10 guardrails in the Voluntary AI Safety Standard – even though this is not currently a legal requirement, doing so will assist with risk management and may become industry practice.
The Governance Institute of Australia (GIA) has released an issues paper on artificial intelligence (AI) and board minutes, addressing the growing use of transcription and generative AI tools to transcribe meetings and generate action items or summaries.
The issues paper flags confidentiality, cybersecurity, IP, inaccuracy and lack of transparency as key legal issues that may arise. It also highlights the importance of technological literacy of those using AI.
In light of directors’ statutory and common law fiduciary duties, the issues paper recommends directors ensure that any AI-generated minutes are a true reflection of board meetings.
To assist boards to navigate a disruptive technological trend, the Australian Institute of Company Directors (AICD) has released a suite of guidance materials for directors on AI governance, focusing on generative AI. ‘A Director’s Introduction to AI’ provides an overview of AI applications and relevance for directors, the risks and opportunities, and the applicable domestic and international regulatory environment.
Like the GIA AI issues paper, the AICD materials urge directors to be mindful of their duties when capitalising on the commercial benefits of generative AI in their organisations.
Generative AI is also raising competition law concerns. Competition and consumer law regulators in the United States, the European Union and United Kingdom have released a joint statement identifying trends in the AI market which they consider may impact a fair, open and competitive environment, and the following competition and consumer risks:
Competition and consumer law regulators in these jurisdictions will continue to monitor AI’s impact on anti-competitive behaviour and detrimental outcomes for consumers. In Australia, the Australian Competition and Consumer Commission (ACCC) recently announced that competition issues in generative AI will be addressed in its 10th Digital Platform Services Inquiry report – so the issue is equally on the radar here.
In August 2024, key Australian regulators released their corporate plans for 2024-25, identifying their areas of focus for the year ahead. Rapid technological innovation was cited across the board as one of the driving factors impacting the regulators’ respective sectors and informing their strategic priorities.
The OAIC has outlined its focus on identifying the unseen harms that impact privacy rights in the digital environment. As part of this focus, it plans to implement a program of targeted, proactive investigations to uncover harms, provide avenues for remediation and set the standard for industry practice. It also flagged that it is looking to exercise the wider range of enforcement powers proposed through the privacy reforms.
The OAIC also has a new role in regulating the ‘Digital ID’ scheme, and has flagged that it is looking to increase the uptake of digital ID use in order to reduce avoidable over-sharing of identity information.
The OAIC states it is aiming to finalise 80 per cent of notifiable data breaches within 60 days, and 80 per cent of privacy complaints within 12 months.
The Australian Communications and Media Authority (ACMA) has also released its annual compliance priorities for 2024-25, which include addressing misleading spam messages, and combatting misinformation and disinformation on digital platforms (note the new Bill referred to above).
Both the Australian Securities and Investments Commission (ASIC) and the Australian Prudential Regulation Authority (APRA) named cyber resilience as a key focus for 2024-25.
In its corporate plan, ASIC stated that it intends to advance digital and data resilience and safety by:
APRA plans to undertake a number of regulatory activities aimed at strengthening the cyber risk-management practices of regulated entities, including:
The OAIC’s notifiable data breaches report for January to June 2024 was published on 16 September 2024. In the report’s foreword, the Australian Privacy Commissioner reminds entities that the scheme is now six years old, and “it is no longer acceptable for privacy to be an afterthought; entities need to be taking a privacy-centric approach in everything they do”.
The number of notifications received in this six-month period was the highest since late 2020, at 527 notifications. Malicious or criminal attacks still account for the majority (67 per cent) of notified data breaches, with human error accounting for 30 per cent and system fault a mere 3 per cent. Phishing (leading to compromised credentials), ransomware and otherwise compromised or stolen credentials make up the majority of reported cyber incidents.
Messages of note in the report include:
A recent privacy determination tests the limits of the employee record exemption in the Privacy Act 1988 (Cth) (Privacy Act).
In ALI and ALJ (Privacy) [2024] AICmr 131, an employee made a complaint after 110 staff were emailed an update about the employee’s (good) recovery following a medical episode in the workplace carpark, which a number of other employees had witnessed.
The employer argued that disclosing the employee’s personal and sensitive information in the update fell within the employee record exemption because the update was directly related to the employment relationship. However, the employer’s argument focused on its employment relationship with other employees who were concerned with the complainant employee’s recovery after the incident. As such, the OAIC was not persuaded that the update was directly related to the employer’s employment relationship with the complainant employee.
The OAIC then found that use of the employee’s personal information in the update breached APP 6, because the employer could have discharged its duty to its other employees without identifying the employee by name in the update.
The OAIC awarded the employee $3,000 for non-economic loss and $125 for expenses. The OAIC declined to award other remedies sought by the employee, such as a charitable donation or an employment reference.
The OAIC has recently conducted a privacy assessment of the Australian Digital Health Agency’s (ADHA’s) ‘my health’ app, including a review of its privacy policy.
Notably, the OAIC’s assessment included consideration of how the app’s privacy policy addressed overseas disclosure. It recommended that catch-all statements intended to “allow for situational responsiveness and to avoid breaching the policy” should be replaced with a more detailed and specific description of any overseas disclosure based on current practice (if there were any such disclosures).
Further, the OAIC noted that the privacy policy was lengthy, repetitive, and included operational and instructional information not relevant to the management of personal information. It recommended that the privacy policy should only include descriptions of how the entity manages personal information. The OAIC also repeated its general guidance that privacy policies should be easy to understand (for example, by avoiding jargon and legalistic terms).
Organisations should consider reviewing their privacy policies against these recommendations.
In 2021, the OAIC found Clearview AI had breached Australians’ privacy through the collection of images without consent, and ordered the company to cease collecting the images and delete images on record within 90 days. Clearview initially appealed the decision to the Administrative Appeals Tribunal but ceased its appeal in August 2023. The OAIC recently announced that further action against Clearview AI was not warranted.
Further, despite raising concerns about TikTok’s use of pixel technology, the Australian Privacy Commissioner has declined to investigate, citing deficiencies in existing privacy laws. Given the recent privacy reform Bill does not amend the definition of personal information, further reforms may be required before the practice can be investigated.
Following a surge of regulatory activity under the Spam Act 2003 (Cth) (Spam Act) and the Do Not Call Register Act 2006 (Cth), in July 2024 ACMA released its Statement of Expectations (Statement). This ‘outcome-focused guide’ sets out ACMA’s expectations of how businesses should obtain consumer consent when conducting telemarketing calls and e-marketing (via email, SMS and instant messages).
The key takeaways from the Statement are:
The Statement also reinforces the existing legal requirements in relation to unsubscribe and opt-out options, including the fact that individuals should not be required to log in to a service to unsubscribe.
The release of this Statement indicates that practices regarding consent are on ACMA’s radar, and organisations should consider reviewing their practices against ACMA’s expectations in the Statement.
The Consumer Data Right (CDR) regime is Australia’s data portability scheme. Introduced in 2019, the scheme has been rolled out sector by sector – so far, to the banking and energy sectors – allowing consumers to direct their service providers (e.g. their bank) to provide their data directly to recipients accredited under the scheme (e.g. a budgeting app).
In a significant update to the scheme, legislation (originally introduced to Parliament in 2022) has recently passed which permits “action initiation”. Action initiation allows an accredited data recipient to take actions on the consumer’s behalf. For example, an accredited recipient (with the consumer’s consent) may be able to make payments, open and close accounts, switch providers and update details on the consumer’s behalf.
Action initiation will only be available for types of actions designated by the Minister, in relation to service providers designated as “action services providers” by the Minister.
Treasury also released, for public consultation, exposure draft amendments to the Consumer Data Right Rules which include (among other changes) proposals to simplify:
Submissions have now closed.
Straight from recent headlines, the Government announced that it is consulting on a proposal to impose social media age restrictions – we examined the proposals in this recent article.
Further, Australia’s eSafety Commissioner has recently issued new industry standards, commenced development of the next phase of industry codes, and issued a number of notices to digital platforms to report and provide information about measures being taken:
APRA’s Prudential Standard CPS 230 Operational Risk Management (CPS 230) sits within the risk-management pillar of APRA’s framework. Operational risk management is essential to ensure the resilience of an entity and its ability to maintain critical operations through disruptions.
Our earlier edition of Digital Bytes canvassed CPS 230’s requirements, which take effect on 1 July 2025. CPS 230 sets baseline expectations for all APRA-regulated entities. Every regulated entity faces operational risks; however, APRA expects Significant Financial Institutions (SFIs) to maintain stronger practices commensurate with the size and complexity of their operations.
In July 2024, APRA released the final version of Prudential Practice Guide CPG 230 along with an accompanying statement setting out responses to submissions made through earlier consultation.
Of particular note, APRA has:
APRA-regulated entities should also be aware that APRA has published a letter to its regulated entities providing additional insights on common cyber resilience weaknesses.
As reported in our recent Above Board publication, the eight observations in APRA’s letter relate to security in configuration management, privileged access management and security testing. These include “inadequate management and oversight of security test findings”; APRA’s guidance is that test results should be reported to the appropriate governing body or individual, with associated follow-up actions formally tracked. Testing, like threat detection, only works if it is followed through.
Some of the updates we can expect in the coming months include:
Finally, if you’re currently focused on what you can do to minimise the aged and redundant personal information you hold, a recent case in the US on Google’s destruction of employee chat records is a timely reminder to ensure that you also take the right steps to preserve evidence.
We have a large team of privacy and cyber specialists, with substantial experience across the whole spectrum of data, privacy and cyber compliance and incident management.
For a more detailed briefing on any of these updates, or to discuss how we can assist your organisation to manage its risks in these rapidly evolving areas, please get in touch.
Big thanks to Alexandra Gauci, Bailey Britt, Dean Baker, James Finnimore, Leonie Higgins, Caitlin Abernethy and Saara Stenberg for their contributions to this edition of Digital Bytes.