Introduction
Two recent proposals by the Australian Government demonstrate a far more aggressive approach to the regulation of digital services than we see here in New Zealand.
The first is the Labor Government's proposed legislation to combat seriously harmful misinformation and disinformation.
Needless to say this proposal has serious implications for freedom of expression, especially for the expression of opinion and of what could be described as contrarian views.
The second proposal is to ban young people from accessing social media.
This too raises a number of issues. Young people (digital natives) get much of their information from online platforms and social media, so once again a freedom of expression issue arises. Is the potential and actual harm caused by social media sufficient to warrant a total ban on its use by a sector of the community? Another issue is that of respect for the law: it is as yet unclear how such a ban would be enforced and policed.
In this article I shall have a look at these proposals and comment on them. They are only proposals at this stage and will need to be fleshed out. The devil will necessarily be in the detail.
Combatting Seriously Harmful Misinformation and Disinformation
The proposal advanced is to give the Australian Communications and Media Authority (ACMA) new powers to hold digital platforms to account and improve efforts to combat seriously harmful misinformation and disinformation.
The ACMA is an independent statutory authority responsible for regulating communications and media services. It oversees a wide range of industries, including telecommunications, broadcasting, radio, and the internet, ensuring they operate in accordance with the laws and standards set by the Australian government.
Key roles and responsibilities of the ACMA include:
· Regulating Broadcast and Online Content: Ensures that television, radio, and online content complies with relevant broadcasting codes, classifications, and standards. This includes issues like accuracy, fairness, and decency in programming.
· Telecommunications Regulation: Oversees the licensing and operation of telecommunications providers, ensuring they adhere to regulatory frameworks around service delivery, network infrastructure, consumer protection, and competition.
· Spectrum Management: Manages the allocation of radiofrequency spectrum to various sectors, such as telecommunications, broadcasting, and public services (like emergency communication).
· Consumer Protection: Implements rules around privacy, spam, telemarketing, and other consumer rights in communications and media services.
· Internet Regulation: Involved in regulating certain aspects of internet use in Australia, including cyber safety, online content that is harmful or illegal, and promoting safe use of the internet for children.
The ACMA plays a central role in ensuring that the communications and media industries are transparent, competitive, and serve the public interest, while also maintaining standards that protect consumers and citizens from harmful content and practices.
The new powers are contained in the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024.
The focus of the Bill is on combatting the most seriously harmful content on digital platforms, while at the same time providing strengthened protections for freedom of speech.
The Bill follows an extensive public consultation process with key stakeholders to refine and prepare it for introduction.
The Bill recognizes the benefits provided by online platforms but also recognizes that these same platforms can serve as a vehicle for the spread of misleading or false information that is seriously harmful to Australians' health, safety, security and wellbeing.
The Bill will empower the ACMA to oversee digital platforms with new information-gathering, record-keeping, code registration and standard-making powers. It will also introduce new obligations on digital platforms to increase their transparency with Australian users about how they handle misinformation and disinformation on their services.
The Bill will complement voluntary industry codes but allow the ACMA to approve an enforceable industry code or make standards should industry self-regulation fail to address the threat posed by misinformation and disinformation.
The Bill does not contain any takedown powers regarding individual pieces of content or user accounts.
The target of the Bill is clearly the online platforms. They will be responsible for managing content on their services.
In some respects the Bill echoes elements of the Safer Online Services and Web Platforms proposals that were put forward by the Department of Internal Affairs. That project has been abandoned by the present Government, but it may be that the Australian proposals could form a template for future ones.
The Bill was introduced and given a first reading on 12 September 2024. A second reading was moved on the same day. Details surrounding the Bill can be found here.
The Bill itself runs to some 73 pages. I shall examine it in detail in another article, but for the moment I shall make a couple of observations.
Disinformation
Disinformation is defined in Clause 13(2). This states:
“dissemination of content using a digital service is disinformation on the digital service if:
(a) the content contains information that is reasonably verifiable as false, misleading or deceptive; and
(b) the content is provided on the digital service to one or more end users in Australia; and
(c) the provision of the content on the digital service is reasonably likely to cause or contribute to serious harm; and
(d) the dissemination is not excluded dissemination; and
(e) either:
(i) there are grounds to suspect that the person disseminating, or causing the dissemination of, the content intends that the content deceive another person; or
(ii) the dissemination involves inauthentic behaviour.”
Misinformation
Content is misinformation if:
“(a) the content contains information that is reasonably verifiable as false, misleading or deceptive; and
(b) the content is provided on the digital service to one or more end users in Australia; and
(c) the provision of the content on the digital service is reasonably likely to cause or contribute to serious harm; and
(d) the dissemination is not excluded dissemination.”
It will be noted that it is element (e), grounds to suspect an intention to deceive or dissemination involving inauthentic behaviour, that differentiates disinformation from misinformation.
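The structural relationship between the two definitions can be made explicit by modelling them as simple predicates. What follows is a minimal illustrative sketch only: the field names are labels of my own devising for the statutory elements, not the Bill's language, and it says nothing about how a platform or regulator would actually apply the tests.

```python
from dataclasses import dataclass

# Illustrative model only: field names are my own labels for the
# statutory elements in cl 13, not the Bill's text.
@dataclass
class Dissemination:
    verifiably_false_misleading_or_deceptive: bool    # element (a)
    provided_to_end_users_in_australia: bool          # element (b)
    likely_to_cause_or_contribute_serious_harm: bool  # element (c)
    excluded_dissemination: bool                      # element (d)
    grounds_to_suspect_intent_to_deceive: bool        # element (e)(i)
    involves_inauthentic_behaviour: bool              # element (e)(ii)

def is_misinformation(d: Dissemination) -> bool:
    # Elements (a)-(d) only: no intent or inauthenticity element is required.
    return (d.verifiably_false_misleading_or_deceptive
            and d.provided_to_end_users_in_australia
            and d.likely_to_cause_or_contribute_serious_harm
            and not d.excluded_dissemination)

def is_disinformation(d: Dissemination) -> bool:
    # The misinformation elements plus element (e): suspected intent to
    # deceive OR dissemination involving inauthentic behaviour.
    return is_misinformation(d) and (
        d.grounds_to_suspect_intent_to_deceive
        or d.involves_inauthentic_behaviour)
```

On this reading, anything that satisfies the disinformation definition also satisfies the misinformation elements; element (e) simply narrows the class.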
In determining whether the provision of content on a digital service is reasonably likely to cause or contribute to serious harm, regard must be had to the following matters:
(a) the circumstances in which the content is disseminated;
(b) the subject matter of the information in the content that is reasonably verifiable as false, misleading or deceptive;
(c) the potential reach and speed of the dissemination;
(d) the author of the information;
(e) the purpose of the dissemination;
(f) whether the information has been attributed to a source and, if so, the authority of the source and whether the attribution is correct;
(g) other related information disseminated that is reasonably verifiable as false, misleading or deceptive;
(h) any matter determined by the Minister under subclause (4);
(i) any other relevant matter.
Serious harm is defined in clause 14.
Serious harm is:
(a) harm to the operation or integrity of a Commonwealth, State, Territory or local government electoral or referendum process; or
(b) harm to public health in Australia, including to the efficacy of preventative health measures in Australia; or
(c) vilification of a group in Australian society distinguished by race, religion, sex, sexual orientation, gender identity, intersex status, disability, nationality or national or ethnic origin, or vilification of an individual because of a belief that the individual is a member of such a group; or
(d) intentionally inflicted physical injury to an individual in Australia; or
(e) imminent:
(i) damage to critical infrastructure; or
(ii) disruption of emergency services;
in Australia; or
(f) imminent harm to the Australian economy, including harm to public confidence in the banking system or financial markets;
that has:
(g) significant and far reaching consequences for the Australian community or a segment of the Australian community; or
(h) severe consequences for an individual in Australia.
Subclause (c), the vilification limb, is very wide in scope.
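One point that is easy to miss on a first reading is that the definition is conjunctive: a harm must both fall within one of the categories in paragraphs (a) to (f) and meet one of the consequence thresholds in paragraphs (g) or (h). A minimal sketch of that two-limb structure, again using labels of my own devising rather than the statutory text:

```python
# Illustrative sketch of the cl 14 "serious harm" test. The category
# names are my own labels, not the Bill's language.
HARM_CATEGORIES = {
    "electoral_or_referendum_process",       # (a)
    "public_health",                         # (b)
    "vilification_of_protected_group",       # (c)
    "intentional_physical_injury",           # (d)
    "imminent_infrastructure_or_emergency_services",  # (e)
    "imminent_economic_harm",                # (f)
}

def is_serious_harm(category: str,
                    far_reaching_community_consequences: bool,      # (g)
                    severe_individual_consequences: bool) -> bool:  # (h)
    # Limb 1: the harm must fall within one of the categories (a)-(f).
    # Limb 2: it must also meet a consequence threshold, (g) or (h).
    return (category in HARM_CATEGORIES
            and (far_reaching_community_consequences
                 or severe_individual_consequences))
```

On this structure, for example, vilification that lacked far-reaching or severe consequences would fall outside the definition.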
One of the problems with the definitions is that they could capture statements of opinion that have been developed from verifiable facts. It may well be that the statement of opinion is wrong, or that the argument and conclusions in it do not stack up and may be false or misleading. In such a case the expression of a contrarian opinion may qualify as misinformation which an online platform would be required to monitor.
This is an issue which immediately occurs to me. A closer study of the Bill and the background material will no doubt raise other issues which I shall deal with in a more detailed article on this subject.
Suffice to say, however, that it comes as no surprise that this Bill emanates from a Labor Government that clearly sees an opportunity to control the narrative by requiring online platforms to act as censors.
Children’s Access to Social Media
Australian Prime Minister Anthony Albanese has supported calls to limit children’s access to social media and Opposition leader Peter Dutton has pledged to implement a ban on under-16s using the platforms. The Australian Government has also pledged $6.5 million to trial age verification technology.
At the moment these are just proposals. No legislation has been introduced, although the intention is to introduce it by the end of 2024.
This follows South Australia's plan, announced on 9 September, to restrict social media access to those aged 14 and over, and the Coalition's announcement earlier this year that it would ban children under 16 from social media within 100 days of winning the next election.
A major problem with the proposals is ensuring effective age verification technology.
Furthermore, a ban is a heavy-handed approach, probably typical of a governing generation of digital immigrants trying to regulate a coming population of digital natives.
Digital technologies have introduced a paradigm shift, especially in the area of communications and in our expectations and use of information. Many of the assumptions that the older generation may have about correct behaviour and values are neither valid nor reasonable for a generation of digital natives. No better example may be found than the statement by Mr Albanese when he said:
“Parents want their kids off their phones and on the footy field. So do I. We are taking this action because enough is enough”
This is clear evidence of a total lack of understanding of the impact that digital technologies have had on values and behaviour. The fresh-air values of Mr Albanese's generation may no longer have relevance to digital natives. What Mr Albanese's generation sees as addictive use of devices is seen by digital natives as communication. This lack of understanding, and the desire to force a new generation to adopt the behaviours of an older one, crumble in the face of a paradigmatic change.
But there are a number of other issues that need to be considered, and these were highlighted by Brent Carey, CEO of Netsafe.
Mr. Carey is of the view that bans
“can prevent children discussing how they’re using online spaces with the trusted adults in their lives and risks driving any subsequent social media use underground. The real question we should be asking is how can we make social media better for children and young people so they have safe, playful, exploratory, fun, entertaining, positive and educational experiences online.”
The age verification issue raises its own problems. Mr Carey comments:
“If you’re going to verify age, you’re going to have to do it for the whole population. How do we feel about platforms collecting biometric information? What are the privacy risks?”
Care needs to be taken in considering the use of biometric information such as facial recognition or facial comparison technologies.
If the government pursues a method where companies require all users, not just younger users, to verify their ages before being able to use a platform, it could result in social media companies being forced to collect user identification. That raises serious privacy issues.
Finally there are issues of consistency in legal regulation and what may or may not be legal for a particular age group. Mr Carey points out:
“At 14, you can be left alone to babysit, or your parents can leave you at home, but you can’t Snapchat with your friends? It just doesn’t make much sense.”
The Chief Censor, Caroline Flora, has also expressed a view, observing that critical thinking and open conversations are important in preventing and addressing harm in this context. Even the Australian eSafety Commissioner, Julie Inman Grant, recognizes that a ban may drive young people underground or to other platforms with which to communicate. Young people banned from Facebook, TikTok and Snapchat could congregate online via the likes of WhatsApp groups, messaging services like Discord, and online gaming forums.
Celia Robinson, on the other hand, criticizes Mr Carey’s approach. She lumps him in with what she disparagingly refers to as “a noisy few” (which I gather includes the Chief Censor) and suggests that
“One of the first steps is to recognise how some organisations, meant to advocate for children, are instead partnered with those who perpetuate the harm. This conflict of interest must be addressed if we are serious about safeguarding our children’s futures.”
It would have been better to have been direct rather than resorting to inference or innuendo. Netsafe does a fine job in its role as the Approved Agency under the Harmful Digital Communications Act, a role which warrants not a mention in Ms Robinson's op-ed.
I gather that as well as being critical of Mr Carey’s approach, Ms Robinson favours the Australian proposals or some other form of heavy-handed paternalistic state intervention.
It should be emphasized that what is being discussed is still at the proposal stage. But it is clear that there is an appetite in Australia for Mr Albanese's proposals. Fortunately, at the moment, that hunger has not spread to New Zealand, although if Ms Robinson had her way it would be welcomed here.