Our recent article, ‘Internet, privacy and data – a year in review’, is an unmissable bumper edition of the privacy, cyber security, online safety and AI highlights from 2024 – most of which occurred in the last quarter of 2024.
Anyone doing business in Australia should make sure they are across these updates, and that changes to the privacy regime and upcoming mandatory ransomware reporting obligations are being reflected in updates to the appropriate policies, processes and procedures.
This special edition of Digital Bytes focuses on developments in the privacy and credit reporting space that you may have missed over the holiday period.
Findings from the OAIC’s facial recognition investigation released
Following a lengthy investigation, Australia’s privacy regulator, the Office of the Australian Information Commissioner (OAIC), released its determination that Bunnings’ use of facial recognition technology did not comply with the Privacy Act 1988 (Cth) (Privacy Act).
Notably, the OAIC found:
- Bunnings had ‘collected’ a record of the personal information of all individuals captured on its CCTV system, even though that record was held for (on average) 4.17 milliseconds before being automatically destroyed (if it could not be matched to Bunnings’ list of persons of interest). The fact that personal information was only reviewed by technology (to conduct a matching process), and was held for such a short period, did not mean that it was not ‘collected’ by Bunnings.
- Bunnings did not comply with the requirements regarding collection of sensitive information as it did not obtain individuals’ consent and exceptions did not apply. Bunnings had implemented the system in order to reduce retail theft and crime, and to protect its employees. The OAIC found that the facial recognition system was not ‘necessary’ to lessen or prevent a serious threat to life, health or safety, or to take appropriate action in relation to unlawful activity or serious misconduct, and had disproportionately interfered with customers’ privacy.
- Bunnings’ privacy policy and collection notices did not provide sufficient information about how Bunnings’ facial recognition system handled personal information (especially personal information of those who were matched to Bunnings’ list of persons of interest and whose personal information collected through the system was retained and used).
Bunnings was ordered not to continue the acts or practices that breached the Privacy Act, to publish a statement about the determination, and to destroy all information handled through the facial recognition system 12 months after publishing the statement. It likely avoided more serious orders by being cooperative throughout the OAIC’s investigation.
Bunnings has announced that it is seeking a review of the determination before the Administrative Review Tribunal.
The OAIC has released a privacy guide for businesses considering the use of facial recognition in commercial and retail settings, which emphasises the importance of addressing:
- necessity and proportionality – handling the minimum amount of personal information for a legitimate purpose by the least privacy-intrusive means;
- consent and transparency;
- accuracy, bias and discrimination; and
- governance and ongoing assurance – making sure practices are underpinned by clear privacy governance arrangements which are regularly audited and reviewed.
Beyond facial recognition, the Bunnings determination demonstrates the importance of ensuring that privacy documentation – especially privacy policies and collection notices – is up to date and comprehensive. While such documents may not be regularly scrutinised by individuals day-to-day, if a potential non-compliance arises, they are thrown into the spotlight and interrogated. The OAIC recommends updating privacy policies at least annually.
Another cautionary tale for public data collection practices
Only nine months after the last determination on the collection of publicly available personal information (which we reported on in our June 2024 edition of Digital Bytes), the OAIC issued another two determinations traversing similar ground.
In November 2024, the OAIC issued determinations against two related companies who had cross-matched publicly available personal information (such as from court lists and death and funeral notices) with property information available from the Core Logic Asia database, to distribute a list of ‘distressed properties’ (lead lists) to their paying customers. After preliminary inquiries were made by the OAIC, identifying information was omitted from the lead lists, but detailed descriptions of how to independently locate the relevant person’s identifying information were provided to customers.
The OAIC found that:
- the companies’ collection of personal information was unfair (in breach of APP 3.5) – in particular, collection of personal information in this way breached the relevant source website’s terms of use, and in circumstances where the relevant individuals had no knowledge or awareness of the collection (and were in or perceived to be in vulnerable positions);
- the companies failed to take reasonable steps to notify the relevant individuals about the collection of their information (in breach of APP 5.1); and
- the companies did not take steps to ensure the personal information was accurate, up-to-date, complete and relevant for the purposes of the use and disclosure (in breach of APP 10.2).
It also found that neither company had a privacy policy that complied with the requirements of the Privacy Act.
The determinations confirm that the collection of personal information from publicly available sources remains an area of concern for the OAIC. Organisations are now well and truly on notice about the privacy considerations that must be addressed as part of this practice.
The determinations are Commissioner Initiated Investigation into Master Wealth Control Pty Ltd t/a DG Institute (Privacy) [2024] AICmr 243 and Commissioner Initiated Investigation into Property Lovers Pty Ltd (Privacy) [2024] AICmr 249.
The OAIC reviews public complaints management
Another recent privacy determination – ‘AQE’ and Noonan Real Estate Agency Pty Ltd (Privacy) [2024] AICmr 237 – reminds organisations to be careful about publishing personal information online.
The OAIC found that a real estate business that posted a person’s name, occupation and some financial information in the course of responding to a negative Google review disclosed personal information in breach of APP 6. In particular, the OAIC found that the use of personal information was unrelated to the primary purpose of collection, and that use to “protect [the business’] integrity” was not a permitted secondary purpose – especially when the business could have responded without identifying the person.
The OAIC’s guidance indicates that publishing information online will constitute an overseas disclosure to which APP 8 applies (where that information is accessible to an overseas recipient). However, the fact that APP 8 was not considered as part of this determination suggests that the OAIC is less likely to pursue APP 8 enforcement action in the context of online publications – possibly as it would require the OAIC to prove that a person overseas had accessed the relevant information.
An organisation’s liability for an employee’s privacy breach explored in OAIC determination
In January 2025, the OAIC released its determination in 'ATE' and 'ATF' (Privacy) [2025] AICmr 10, which considered whether an organisation was liable for unauthorised disclosure of personal information in breach of APP 6, when its employee sent personal information about its customer to a journalist, in breach of the organisation’s policies and procedures.
The OAIC confirmed that an organisation will be directly liable where an employee ‘direct[s the] mind and will’ of the organisation. The OAIC found that where the employee disclosed personal information to the media in breach of the organisation’s media contact policy, the employee was not directing the mind and will of the organisation.
The OAIC also confirmed that for the organisation to be vicariously liable, the employee must be acting in the performance of the duties of their employment. This may encompass unauthorised, intentional or criminal acts, provided there is sufficient connection between the act and the employee’s ordinary employment duties. The OAIC found that in this case, the employee’s duties did not include, and were insufficiently connected to, contacting the media. The OAIC also indicated that the employer’s intention to dismiss the employee for gross misconduct (had the employee not resigned) ‘supports the argument’ that the employee’s acts were not in the performance of their duties.
While direct and vicarious liability will be a pertinent consideration when assessing non-compliance with the Australian Privacy Principles, organisations should note that the notifiable data breach regime does not depend on the employer being liable for the non-compliance, so an unauthorised disclosure (for which the employer is not liable) may still trigger notification obligations.
Tort of privacy developments
While all eyes have been on the passage of the new tort of privacy as part of the raft of privacy reforms which became law on 11 December 2024 (you can read more about this in our earlier article), a County Court of Victoria decision has quietly attempted to steal its thunder by deciding that a tort of privacy already exists at common law.
In the decision of Waller (A Pseudonym) v Barrett (A Pseudonym) [2024] VCC 962, the Court found that a remedy in tort should be available where private matters are made public “in circumstances that a reasonable person, standing in the shoes of the claimant would regard as highly offensive”. The Court did not determine the elements of the cause of action, or the available defences.
As entertaining as such a novel decision is, it is unclear whether it will be followed and, in any event, it is likely to be subsumed by the statutory tort (which will take effect on a date to be confirmed but no later than 11 June 2025).
As we’ve noted in previous updates (including our article from September 2024), the new statutory tort provides that an individual will have a cause of action in tort against another person for certain serious invasions of privacy, that are intentional or reckless, where the individual would have a reasonable expectation of privacy in the circumstances. The tort will apply to invasions of privacy in the following circumstances:
- intruding upon the plaintiff’s seclusion (meaning physically intruding into the person’s private space, or watching, listening to or recording the person’s private activities or private affairs); or
- misusing information that relates to the plaintiff (including, but not limited to, collecting, using or disclosing information about the individual, whether true or not).
In certain circumstances, a public interest balancing test and certain defences will be available.
Changes to the credit reporting regime in Australia
Late September and early October 2024 was a busy period in the credit reporting regulatory space.
On 30 September 2024, an independent report into Australia’s consumer credit reporting framework was released. The next day, the OAIC registered a new Credit Reporting Code (CR Code) under the existing laws, most of which took effect immediately.
For background, Australia’s consumer credit reporting framework is housed in the Privacy Act and the National Consumer Credit Protection Act 2009 (Cth) (Credit Act), which together regulate how consumer credit providers conduct credit checks and provide credit reporting information to credit reporting bodies, and how credit reporting bodies hold and manage that information.
The independent review report made 37 recommendations, including the consolidation of the two legislative regimes into a streamlined Act. A number of the recommendations aim to:
- make the scheme more accessible to consumers;
- modernise the categories of credit data that are subject to the regime;
- update the processes for correction of information, implementing credit report bans and protecting against fraud;
- facilitate data sharing within the credit reporting ecosystem, for example by prohibiting exclusive arrangements between credit reporting bodies and credit providers; and
- increase oversight activity by the regulator, the OAIC.
The Government is considering the review report and is expected to conduct further consultation on the recommendations.
Key features of the new Code include:
- changes to the credit report ban request regime, including a free alert system to notify a victim of fraud who has requested a ban (including following a data breach exposing them to potential fraud) if someone has attempted to access credit during the ban period;
- updates to the correction regime to allow more streamlined correction of information following a fraud event, and to recognise domestic violence as being circumstances that may warrant correction of information; and
- increased transparency about how credit reporting bodies and credit providers comply with their obligations, including additional matters to be addressed in collection notices.
Credit providers should update their documentation, processes and procedures to address the changes to the CR Code, and should monitor for opportunities to have a say on broader changes to the consumer credit reporting framework.
Know what to expect when it comes to unexpected use of customer data
Two recent proposed class actions in the United States, against Apple and LinkedIn, demonstrate the legal risks involved in unexpected use of customer data, including for training AI models.
Apple
In the first week of January 2025, Apple agreed to pay US$95 million to settle a proposed class action which alleged that Apple had violated the plaintiffs’ privacy by recording their conversations via ‘Siri’-enabled devices and disclosing this data to third parties, including advertisers.
In 2019, The Guardian reported that Apple had been collecting and sharing recordings of the conversations of users who had activated the Siri function. Some of the plaintiffs reportedly claimed, for example, that after mentioning ‘Air Jordan’ sneakers and ‘Olive Garden’ restaurants while Siri was unintentionally activated, the plaintiffs received targeted advertising for these products. Another plaintiff claimed that they received targeted advertising for a surgical treatment after they had (what they thought was) a private conversation with their doctor about such treatment.
Apple denies these claims and, in a statement published on 9 January 2025, states that it “does not retain audio recordings of Siri interactions unless users explicitly opt in to help improve Siri, and even then, the recordings are used solely for that purpose”.
LinkedIn
In a class action filed in January 2025 on behalf of LinkedIn Premium users, it is alleged that LinkedIn shared users’ private messages with other companies to train AI models. Further, the proposed class action alleges that, in August 2024, LinkedIn “quietly” introduced a privacy setting that allowed users to enable or disable sharing of their personal data (which was allegedly automatically enabled), and then, in September 2024, “discretely” updated its privacy policy to state that data could be used to train AI models.
It is also alleged that these actions “indicate a pattern of [LinkedIn] attempting to cover its tracks … [which] suggests that LinkedIn was fully aware that it had violated its contractual promises and privacy standards and aimed to minimise public scrutiny".
LinkedIn denies the claims, describing them as “false” and “with no merit”.
Key takeaways
These examples highlight the importance of assessing community expectations about personal information handling and reputational impact in addition to compliance with privacy obligations.
When updating privacy documentation, organisations should ensure that important changes are brought to the attention of customers and users before they are implemented.
When considering changes to personal information handling, organisations should strongly consider conducting a privacy impact assessment (PIA) to identify how the organisation can manage, minimise or eliminate potential privacy impacts. While conducting a PIA is currently best practice for organisations, the second tranche of Privacy Act reforms may mandate that organisations conduct a PIA prior to undertaking high privacy risk activities.
Finding against the European Commission puts overseas disclosure in the spotlight
On 8 January 2025, the European General Court ruled that the European Commission had unlawfully transferred an individual’s data (namely, their individual IP address) to the United States without adequate protections, by allowing the individual to sign into a conference booking portal through the Facebook sign-in option.
There are limited bases on which an organisation bound by the GDPR (or its equivalent for European public authorities, such as the European Commission) may transfer personal information to an overseas recipient. The European General Court did not consider whether the individual had consented to the transfer, but found that no other valid basis for overseas transfer applied.
In particular, at the time of the transfer of the individual’s data, there was no ‘adequacy decision’ for the United States and Facebook’s general terms and conditions (which applied to the Facebook sign-in option) were not considered to satisfy the requirement to provide appropriate safeguards.
The European Commission was ordered to pay the individual €400.
This decision demonstrates the challenges involved in handling personal information (and complying with the requirements for overseas disclosure) in the online environment – even regulators can make mistakes.
It is unclear whether similar findings would be made under Australian privacy laws for a number of reasons, including the following.
- Under EU privacy laws, an IP address (in isolation) would usually be considered personal data. Under current Australian privacy laws, an IP address (in isolation) would not usually be considered personal information unless it can be linked to an identifiable individual, in which case it will be personal information.
- Under Australian privacy laws, cross-border transfers of personal information are permitted where the individual has provided consent. It could be argued, for example, that where an individual chooses to sign in via Facebook, they have consented to disclosure of their personal information for the purposes of such functionality. However, as in the EU, the bar for establishing consent to overseas transfer of personal information is high, and the mere fact that the functionality is optional may not be sufficient.
- It is possible that the typical privacy-conscious individual in Australia would not consider this form of data handling sufficiently egregious to warrant a privacy complaint.
Your cyber, privacy, AI and data new year’s resolutions
Our suggestion is that your organisation’s cyber, privacy, AI and data new year’s resolutions should include:
- refreshing any public-facing privacy documentation that has not been reviewed in the past 12 months, both to consider accuracy and completeness, and to review what commitments are made (given the rise in privacy litigation based on such commitments);
- updating privacy manuals, training materials and similar documents and processes to reflect the latest updates to the Privacy Act, including the OAIC’s enforcement powers;
- updating data breach response plans (and similar) to address ransomware payment reporting requirements (for commencement on 30 May 2025);
- stress testing cyber incident response plans and playbooks (through immersive simulations or tabletop exercises) to ensure they are as practical as possible should a cyber incident occur;
- mapping out how AI is used, and may be used, in your organisation and considering whether the 10 voluntary AI guardrails in the Voluntary AI Safety Standard should be adopted;
- auditing the effectiveness of personal information destruction/de-identification processes and procedures that are intended to address over-retention of information;
- reviewing data holding arrangements with your key suppliers and, where contracts have ended, confirming that the supplier has destroyed or de-identified personal information it held on your behalf (where legally permissible); and
- given the recent spate of privacy determinations, identifying any collection of personal information from publicly available sources or data brokers, and conducting a PIA (if one has not been conducted already).
How we can assist
We have a large team of privacy and cyber specialists, with substantial experience across the whole spectrum of data, privacy and cyber compliance and incident management.
For a more detailed briefing on any of these updates, or to discuss how we can assist your organisation to manage its risks in these rapidly evolving areas, please get in touch.
Many thanks to Henry Bakker for his contribution to this edition of Digital Bytes.