In line with recent global trends, online platforms have come under scrutiny from Australian regulators and legislators on numerous fronts, ranging from market power and competition to consumer rights and privacy.
Following the introduction of Germany’s NetzDG laws and the UK’s Online Safety Bill, Australia has enacted its own online safety reform in the Online Safety Act 2021 (Act), which comes into effect in January 2022 (six months after receiving royal assent on 23 July 2021). The Act will expand the legal responsibilities of a range of online service providers for the content they host, including in respect of cyberbullying, harassment, prohibited content, and violent material.
Online service providers face a range of legal risks in respect of user content, most of which are not new. Notably, these risks include potential liability for user content that is defamatory, infringes copyright, or breaches other laws regulating offensive content. Mitigating these risks is not straightforward given the vast, and rapidly growing, volume of user content.
Although the Act appears primarily targeted at regulating Big Tech social media platforms, it will raise concerns for a far broader range of online service providers. The Act is stated to apply to providers of “social media services”, “relevant electronic services”, and “designated internet services”. These concepts are broadly defined: for instance, “relevant electronic services” includes email, instant messaging and other forms of electronic chat, SMS, MMS, and multiplayer online gaming. Further, the Act extends to service providers that facilitate online activity and content, such as cloud infrastructure providers and internet service providers, which often do not have visibility of, or control over, the specific content that may be regulated.
Online service providers operating in Australia will need to reassess their user content policies, legal terms, and systems and processes used to monitor and moderate user content to ensure they are in a position to comply with the Act.
Prior to the Act taking effect, online content is regulated by a patchwork of existing legislation, including the Enhancing Online Safety Act 2015 (Cth), the online content scheme in Schedules 5 and 7 to the Broadcasting Services Act 1992 (Cth), and the ‘abhorrent violent material’ provisions inserted into the Criminal Code Act 1995 (Cth) in 2019.
This existing legislative framework imposes (often overlapping) obligations on online service providers that host, deliver, publish or facilitate the distribution or communication of certain types of content, such as age classified content, ‘intimate images’, ‘abhorrent violent material’, and content that constitutes cyber-bullying. Such obligations include complying with take-down and removal notices issued by the regulator within specified timeframes, and refraining from hosting or distributing prohibited content once on notice of it.
The eSafety Commissioner (the Commissioner), an independent statutory office supported by the Australian Communications and Media Authority (ACMA), is the primary regulator of this legislative framework. The Commissioner may impose penalties for non-compliance, which vary depending on the specific breach and its severity and duration, but can be significant (e.g. the greater of AUD 11.1 million or 10% of annual turnover for failures to remove abhorrent violent material).
The Act consolidates and expands on these existing legislative frameworks, including by introducing a set of Basic Online Safety Expectations (BOSE) for online service providers, extending the cyber-bullying scheme to capture serious online abuse of adults, reducing the time within which providers must respond to removal notices from 48 to 24 hours, and updating the online content scheme regulating access to harmful material.
The Act also extends the Commissioner’s investigative powers, including powers to obtain identifying information from relevant providers about anonymous accounts and to ‘do all things necessary or convenient’ for the performance of the Commissioner’s functions. While the BOSE themselves are not directly enforceable, failure to comply with the Commissioner’s associated reporting requirements will attract civil penalties, and the Commissioner may name providers that do not meet the expectations while also publishing statements of compliance for those that do.
The Department of Infrastructure, Transport, Regional Development and Communications is conducting the consultation on the BOSE and has released a draft BOSE determination for public submissions. The consultation is due to close on 15 October 2021, with the determination due to be implemented in June 2022.
The BOSE are principles-based and drafted broadly, reflecting the policy intention to give service providers ‘flexibility’ in how they meet the expectations. In addition to the core expectations, there are ‘additional expectations’ that provide further clarity as to what is expected of providers. The BOSE will apply to providers of social media services, ‘relevant electronic services’ and ‘designated internet services’.
Broadly, the draft determination is set out in three parts, covering expectations regarding safe use, reporting and complaint mechanisms, and the provision of information accessible to end-users and the Commissioner. Notably, under the proposed BOSE, online service providers will be required to take reasonable steps to ensure that end-users can use their services safely and to minimise unlawful or harmful material; to take reasonable steps to detect and address such material on encrypted services; to take reasonable steps to prevent anonymous accounts from being used for unlawful or harmful conduct; to maintain clear and accessible mechanisms for end-users to report and complain about such material; and to provide information requested by the Commissioner within specified timeframes.
The Act and the draft BOSE have given rise to concerns from a variety of industry participants regarding their far-reaching scope and numerous practical problems in achieving compliance. For example, the expectation that providers detect and address harmful material on encrypted services sits uneasily with end-to-end encryption, which by design prevents providers from inspecting the content of communications; similarly, cloud infrastructure and internet service providers often have no visibility of, or control over, the specific content they carry.
At this stage, it is not clear how such issues will be overcome across the vast array of online service providers likely to be subject to the Act. In many cases, compliance will likely require significant investment in processes and automation technology to monitor and respond to harmful content.
Online service providers should consider whether and how their services fall within the Act’s broad definitions, reviewing and updating their user content policies and legal terms accordingly, and investing in the systems and processes needed to monitor, moderate and remove user content within the timeframes the Act requires.