Intermediate Steps for Digital Intermediaries: The Stage 2 Defamation Reforms
27 June 2024
The long-awaited second tranche of reforms to the uniform defamation legislation is finally due to come into effect. Each state and territory has agreed to implement the reforms to the model defamation provisions, with the exception of South Australia, which will not be implementing Part A of the reforms.
These reforms follow the Stage 1 reforms that took effect in July 2021 (except in the Northern Territory and Western Australia, which have yet to implement them). The Standing Council of Attorneys-General (SCAG) has agreed to review the implementation of both the Stage 1 and Stage 2 reforms three years after the commencement of Stage 2, in 2027.
There are two parts to the planned reforms: Part A concerns the liability of internet intermediaries in defamation for the publication of third-party content, and Part B relates to absolute privilege.
The Part A reforms appear to be a step in the right direction towards ensuring that digital intermediaries are not held liable for third party publications over which they had no input or control. This is likely to slow the trend seen in recent years of plaintiffs pursuing the digital intermediary whose services were used to make the defamatory material available, rather than the individual who originated the defamatory material.
The reforms do not perhaps go so far as some had hoped in dealing with the "Voller problem": while the digital intermediaries now have protection, there is no such protection for the individual who made the original post or who originated the thread on which the third party defamatory comment was made. Even if the amendments had been available at the time of Voller, the defendants in that proceeding (Fairfax Media, Nationwide News and Australian News Channel) would not have been able to avail themselves of the new defence or exemptions in relation to the third party comments made on their Facebook posts.
Accordingly, concerns will remain for users of online services that they may be liable for defamation for publications by third parties – for example, an individual who opens a forum thread to discuss a topic, or a community organisation that operates a Facebook page, may each be held liable for posts by third parties within the thread or on the page that they have created. It appears that affected users may have to wait until 2027 to see reform in this area.
There are also deep concerns that, in order to avail themselves of the new innocent dissemination defence, digital intermediaries will essentially be required to take access prevention steps in relation to the allegedly defamatory material (i.e. removing or blocking access), regardless of whether or not the material is actually defamatory. This is likely to lead to excessive censorship of material if individuals know that they can get something removed merely by alleging it is defamatory. This result seems contrary to the purported aims of the reforms. Whether these concerns come to fruition remains to be seen, and is unlikely to become clear until there is judicial consideration of the new section.
The Part B reforms are also a step towards providing victims of crime with protections to ensure they do not become embroiled in defamation proceedings as a result of reporting a crime to authorities. However, more could be done to ensure that victims who report to other support services are protected, and that victims have access to safeguards commensurate with those available in criminal proceedings (e.g. in relation to giving testimony via video link).
As noted above, each state and territory has agreed to enact legislation to implement the Part B reforms and all but South Australia have agreed to Part A. At the time of writing, only New South Wales, Victoria and the ACT had introduced legislation to enact the reforms.
What was previously the uniform defamation legislation is becoming less uniform with each iteration of reform.
It remains to be seen whether the Northern Territory and Western Australia will implement this round of reforms, despite ostensibly agreeing to implement the changes.
The inconsistencies arising from the 2021 reforms have already led to forum shopping by plaintiffs, and this seems likely to continue unless the states and territories agree to return to true uniformity.
We set out below a summary of the amendments and their potential impacts.
Part A aims to reform the model laws to strike a better balance between protecting individuals' reputations and not unreasonably limiting freedom of expression in circumstances where third parties publish defamatory matter via internet intermediaries.
These reforms are in part a response to the decision in Fairfax Media Publications Pty Ltd v Voller (2021) 273 CLR 346, where media companies were held liable as the publishers of third party comments on their Facebook pages responding to news stories they posted. This decision caused serious concerns for media companies and internet intermediaries.
Ultimately, these reforms aim to clarify when these companies will be liable for defamatory content, and the onus on them to remove potentially defamatory material.
One of the most significant amendments is the new section 31A, which introduces a defence for digital intermediaries in respect of the publication of defamatory matter.
This defence applies if:
a) the defendant is a digital intermediary;
b) the defendant has an accessible complaints mechanism for the plaintiff to use to complain about defamatory content; and
c) if a written complaint was made, then the defendant has taken reasonable access prevention steps either before the complaint was made or within seven days after the complaint was made.
This defence introduces a number of new concepts into the defamation legislation:
| Concept | Definition |
| --- | --- |
| Digital intermediary | A person or corporation, other than an author, originator or poster of the matter, who provides or administers the online service by means of which the matter is published. |
| Online service | A service provided to a person to enable the person to use the internet, for example, a social media platform, forum or website. |
| Accessible complaints mechanism | An easily accessible address, location or other mechanism available for a potential plaintiff to use to complain to the digital intermediary about the publication of digital matter. An example would be an email address or a part of a webpage that enables details of a complaint to be submitted. |
| Written complaint | A complaint containing information sufficient to enable a reasonable person in the defendant's circumstances to be made aware of: a) the name of the plaintiff, the matter and the location of the content; and b) the fact that the plaintiff considers the matter to be defamatory. The complaint must be given using an accessible complaints mechanism. |
| Access prevention step | A step to remove the matter, or to block, disable or otherwise prevent access to the matter, whether by some or all persons. |
To take advantage of the new defence, the digital intermediary must show that the access prevention steps were reasonable for the intermediary to take in the circumstances and, where the steps were taken by another person, it must have been reasonable for the digital intermediary not to take steps because of the steps already taken (e.g. the originating user having hidden the post).
The limited time period in which a digital intermediary is expected to implement the access prevention steps places pressure on the internet intermediary to take action. It might not be clear within that period, if ever, that the digital matter is defamatory, and whether any defences would be available in relation to the material (e.g. truth, honest opinion, etc). It is unlikely that a digital intermediary will ever be well placed to make this determination as to whether the material is defamatory and/or defensible.
The defence as framed requires intermediaries to take action in relation to potentially defamatory content without knowing whether those steps are actually necessary. This framing of the defence places a high burden on internet intermediaries, essentially requiring them to take removal steps for any material about which they receive a complaint.
Despite the stated intentions of this round of reform, this amendment is likely to lead to excessive removal of content online, preventing users from expressing themselves freely and from having open discourse on online forums (particularly online review forums) that are designed for honesty and anonymity. Since intermediaries lose the protection of the defence if they do not comply with a complaint, they are likely simply to remove or prevent access to material whether or not it is defamatory, resulting in unnecessary censorship.
The reforms create new exemptions from liability in defamation for two narrow categories of digital intermediaries: search engine providers, and digital intermediaries providing caching, conduit or storage services.
Helpfully, and perhaps acknowledging the historical reluctance of the Federal Court to determine questions early in defamation proceedings, the amendments also include a new section 10E, which provides that the question of whether one of the exemptions discussed below applies should be determined as early as possible in the proceedings (rather than at the final hearing).
Section 10D of the amendments introduces an exemption for search engine providers from defamation liability. This exemption:
a) requires the provider to prove their role was limited to providing an automated process for the user of the search engine to generate the results;
b) applies only to search results generated by the search engine limited to identifying a webpage on which content is located by reference to one or more of the title, hyperlink, extract or image from the webpage; and
c) excludes search results to the extent the results are promoted or prioritised by the search engine provider because of payment given to the provider on behalf of the third party.
This amendment confirmed part of the decision in Google LLC v Defteros (2022) 403 ALR 403 (Defteros) in which the High Court found that Google was not liable for defamation when it merely provided a hyperlink to defamatory material in results on its search engine.
Ultimately, this exemption makes it clear there is no liability for automatically generated defamatory search results (that are not sponsored search results).
This exemption is helpful for search engine providers that have no interest in the specific content that has been automatically generated by the search engine in response to a user query. It addresses an increasing trend to sue the search engine provider, rather than the originator of the defamatory material, as potentially an "easier" (and, from the perspective of the plaintiff, a more pecunious) target for litigation. Search engine providers will be hopeful that this reform encourages plaintiffs to target their efforts towards the originators of defamatory material.
Under the new section 10C, a digital intermediary will be exempt from liability for defamation where:
a) the matter has been published using a caching service, a conduit service, a storage service or a combination of those services;
b) the intermediary's role in the publication was limited to providing one or more of these services; and
c) the intermediary did not take an active role in the publication.
This exemption will apply to services such as internet service providers (ISPs), cloud storage services and email services, where the service has been used to make defamatory material available (e.g. through a defamatory email) but has otherwise played no part in the publication.
There had been some concern expressed by stakeholders that this provision overlaps with section 235 of the Online Safety Act 2021 (Cth). Section 235 provides that companies responsible for online platforms are not liable for defamatory content unless they are aware of it. Stakeholders were concerned that the potential inconsistencies between these provisions (i.e. the distinction between being aware of the content and taking an active role in its publication) may lead to confusion about when a digital intermediary will be liable for a defamatory publication made using its services. In the Communique announcing the reforms, it was also announced that the Commonwealth Government will exempt state and territory defamation laws from the operation of section 235, to resolve this potential confusion.
Section 39A grants courts the power to make orders against non-party digital intermediaries concerning defamatory digital matter. Where a court has given final judgment, or issued an injunction, the new provision allows the court to order a digital intermediary (who was not a party to the relevant litigation) to take access prevention steps to prevent or limit the continued publication or republication of the matter complained of. These orders can also require the intermediary to comply with the judgment, injunction or other order made in the defamation proceedings.
Concerns were raised through the consultation process about the lack of geographical and temporal limitations on this power. It is unclear whether the power is limited to making orders in respect of access by Australian users of the service, or whether the orders are intended to apply globally. It is also unclear whether the orders are limited to content at specific locations on the internet (e.g. a specific URL), or whether the digital intermediary is expected to undertake a monitoring exercise and remove any additional instances of the content that arise. Such a requirement would be resource-intensive and unduly burdensome on an entity that is not even a party to the proceedings, and which does not appear to have an avenue to recover its costs of compliance.
Preliminary Discovery: Section 23A allows a person bringing defamation proceedings for the publication of digital matter to obtain an order for, or in the nature of, preliminary discovery, for the purposes of:
a) obtaining information to assist in identifying the individual who published the matter; or
b) assisting in locating the physical or digital addresses at which concerns notices may be given, or defamation proceedings commenced, against them.
In determining whether to make these orders, a court must consider privacy, safety and public interest considerations which might arise. It is not clear why this provision has been added in circumstances where preliminary discovery is already available to potential plaintiffs.
Offer to Make Amends: Section 15 has been updated to allow for an offer to make amends in respect of online publications to include an offer to take access prevention steps.
Electronic Service: Section 44 now provides that service may be effected not only by email, but also by messaging or other electronic communication to an electronic address or location indicated by the person or body corporate being served. This will assist applicants whose only knowledge of the individual defaming them is an account (or screen) name.
Part B of the reforms aims to protect victim-survivors from the threat of defamation when they report alleged criminal and unlawful conduct and misconduct to police.
The defence of absolute privilege as it was previously set out did not apply generally to publications of defamatory matter to police forces or services. There was concern that the threat of potential defamation proceedings may be deterring some people from making complaints to police forces or other services.
The amended defence will now cover the publication of allegedly defamatory material to an official of a police force or service of an Australian jurisdiction while the official is acting in an official capacity. For this purpose, an official of a police force or service is an officer, employee or member of staff of the police force or service, or another person engaged to act for or on behalf of it.
This reform is part of a larger trend of looking at how victims of crime are treated in defamation proceedings, and in particular victims of sexually-based offences (for example Victoria's recent amendments to the Judicial Proceeding Reports Act 1958 (Vic) to protect victims' privacy in civil proceedings). There is concern that this reform does not go far enough, given it only protects publication to police, and not other support services that a victim may approach.
Authors: Robert Todd, Partner; Nick Perkins, Counsel; Imogen Loxton, Senior Associate and Prashana Coomarasamy, Graduate.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.