Introduction
In my article about restricting access to social media by under-16-year-olds I said:
“Every bone in my body screams that I should remain silent on this issue - at least for the moment. One reason is that the Social Media Age-Appropriate Users Bill mentioned in the article is not yet available on the Legislation website and therefore any detailed comment on the proposal would be premature.”
Accordingly, I refrained from detailed comment about a Bill whose contents I had not seen, and restricted myself to some preliminary observations.
I have managed to secure a copy of the Social Media Age-Restricted Users Bill.
It helps to clarify a few issues and in fact is not as wide-ranging as the mainstream media reporting on the subject suggests, but it still has problems and is going to require a lot more work.
What the Bill Does NOT Do
The Bill has been touted as a means by which children under 16 will be unable to access social media. Their access will be controlled by an age-verification system which must be put in place by the provider.
But the Bill does not restrict access to ALL social media platforms. Access will be restricted only to platforms that are designated by the Minister as age-restricted social media platforms. The Bill sets out a number of tests that are required before such designation. More on that later.
That means that the Bill does not take people under the age of 16 offline in the sense that they will be unable to access ANY social media platforms. Unless a platform has been designated and thus requires age-verification, it will be available to <16s.
Clearly the reporting about the policy and the proposal has been erroneous and lacks nuance. The responsibility for this lies in the hands of the Prime Minister and the MP advancing the Bill, Ms. Catherine Wedd, ably assisted by some exaggerated reporting from the MSM.
It would have been helpful had it been made clear that the Bill was not going to automatically apply to ALL social media platforms but only to those which fulfilled the requirements set out in Clause 14(2) of the Bill.
But of course that would have diluted the impact of the announcement and the publicity surrounding it – an example of controlling the message about which I have written and about which I shall have more to say in subsequent articles.
The Bill
The Bill comprises a Policy Statement, a Clause-by-Clause analysis and then the body of the Bill itself. A copy of the Bill is available here.
One thing to note is that the draft is dated 21 March 2025 at 10:50 am. Clearly the Bill has been slumbering for over a month, waiting for the right moment to make an appearance.
In this article I will summarise the content of the Bill, comment on it, and focus attention on a few issues that arise. I will then make some observations about enforceability.
The Provisions of the Bill
The purpose of the Bill is stated as protecting
“young New Zealanders from the harms of social media by regulating access for individuals under the age of 16. The policy aims to:
· reduce the risks to young people of cyberbullying, harmful content, and online exploitation
· safeguard young users' privacy from commercial exploitation and data breaches
· encourage healthier social interactions and offline activities.”
This is an interesting policy statement. The first comment, about the harms of social media, clearly ignores the provisions of the Harmful Digital Communications Act 2015. Banning young people under 16 from social media is seen as a way of mitigating harm.
A similar observation could be made about the reduction of risks to young people from cyberbullying and the like. Once again, the HDCA is available.
But the bigger question is whether the Government should be operating in this space at all. This is the province of parents. It is not enough to say that parents are failing. The Censor, in the report "Content that Crosses the Line", argues that resources should be made available to assist parent/child communication so that children and young people need not feel reticent about approaching their parents and discussing their reactions to online content.
Essentially what the State is doing is nominating the platforms that require age verification, taking the decision as to which platforms are accessible out of the hands of the users and, more importantly, their parents. If a platform is designated as requiring age verification, and the parents of an under-16 actually wanted their child to have access, that access would be unavailable – unless a way of circumventing the age verification procedures was employed. However, that would be to encourage disrespect for the law. Not a good example to set.
The final remark sounds like something that Anthony Albanese said about getting kids out of the house and into the playground. In May 2024, discussing the Australian ban on under-16s accessing social media, he said:
“What we want is our youngest Australians spending more time outside playing sport, engaging with each other in a normal way and less time online, and one way to do that is through restrictions on social media.”
The State should not be dictating what may be healthy activities for young people. That is up to parents.
What Does the Bill Do?
The Bill mandates that social media platforms implement strict age verification measures to prevent under-16s from creating accounts.
It introduces regulatory oversight to ensure compliance and penalise non-compliant platforms.
Additionally, it promotes digital literacy and public awareness programs to educate parents and children on safe online behaviour.
The last element is something that the Censor recommended and might be a proper objective for State involvement: the provision of resources to help parents deal with issues that their children might raise about online activity. At the moment children seem to be "bottling up" their concerns. They are not discussing the problems that they encounter with their parents or other trusted adult advisers. Banning them from accessing social media is a blunt approach to a complex and nuanced problem.
The Focus of the Bill
The Bill doesn't prohibit under-16s from accessing social media. It states that age verification is required for designated platforms.
The Bill proposes that if an under-16 tries to open an account with a designated provider they will encounter an age verification block that they will have to satisfy.
If they cannot satisfy that block they will be unable to register for the social media platform. The Bill places the onus on social media platforms to take reasonable steps to prevent an under-16 from becoming an account holder on the platform. A sketch of how such a registration gate might operate follows.
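By way of illustration only, here is a minimal sketch in Python of the kind of registration gate the Bill contemplates. Nothing here reflects any platform's actual system: the verify_age function and the VerificationResult type are hypothetical stand-ins for whatever age-assurance method (RealMe, document checks or the like) a designated provider might adopt.

```python
# Minimal sketch only: a hypothetical registration gate of the kind the
# Bill contemplates. verify_age() is a stand-in for a real age-assurance
# service; it is assumed for illustration, not an actual API.
from dataclasses import dataclass

MINIMUM_AGE = 16  # the threshold the Bill sets


@dataclass
class VerificationResult:
    completed: bool     # did the age-assurance check run to completion?
    asserted_age: int   # the age the method vouches for


def verify_age(applicant_id: str) -> VerificationResult:
    """Placeholder for a real age-assurance backend (hypothetical)."""
    raise NotImplementedError("supply an actual age-assurance method")


def register(applicant_id: str) -> bool:
    """Clause 7 in miniature: take reasonable steps to prevent an
    age-restricted user from becoming an account holder."""
    result = verify_age(applicant_id)
    if not result.completed or result.asserted_age < MINIMUM_AGE:
        return False  # the block is not satisfied; no account is created
    return True       # a verified 16-or-over applicant gets an account
```

Everything of substance, of course, sits inside verify_age – which is precisely where the vagueness of "reasonable steps", discussed below, begins to bite.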
Definitions
When lawyers look at a statute the first port of call is the section on interpretation. How are terms interpreted for the purposes of the statute (or, in this case, the Bill)?
An account holder in relation to a social media platform
(a) means the person that is registered as a user of the platform; and
(b) includes a person that has an account, or has established a profile, with the platform
Social media platform is defined next in this overview; in the Bill, definitions are listed alphabetically.
I am a little puzzled at the use of the conjunctive "and" at the end of clause (a). Usually when the conjunctive is used there are two limbs to a test, but in this case I think the word is there for the purposes of clarity. It is to be noted that the tenses used are the present ("is registered", "has an account") and the perfect ("has established a profile").
The implications of this definition are quite significant.
A social media platform is defined as:
(a) an electronic platform that satisfies the following conditions:
(i) the sole or primary purpose of the platform is to enable online social interactions between 2 or more end-users:
(ii) the platform allows end-users to link to, or interact with, some or all of the other end-users:
(iii) the platform allows end-users to post material on the platform; or
(b) an electronic platform specified in regulations.
In the previous comment I referred to the use of the conjunctive "and". In this definition the disjunctive "or" makes an appearance. But the problem here is that the list of conditions in clause (a) is confusing: at the end of (i) and (ii) there is neither "and" nor "or". I think what is intended is that all three conditions have to be fulfilled to qualify as a social media platform but, so that there could be no doubt, clauses (i) and (ii) should be followed by the conjunctive "and".
It is to be noted that the platform has to have a primary purpose of enabling online social interactions. The problem is that there is no definition of what constitutes "social interactions".
Arguably this Substack could qualify. It is an electronic platform. It allows end users to post material. It allows interaction between users (in the comments section). Its primary purpose is to make my writing available to you, the readers. The social interactions element is secondary. So as the definition stands I might just avoid legislative capture. And that assumes that this Substack would avoid designation under Clause 14(1) and (2).
But wait – there’s more.
Subsection (2) of the definition says that online social interaction includes online social interaction that enables end-users to share material for social purposes.
Social purposes are not defined.
The following purposes must be disregarded in determining what the primary purpose of the platform is: the provision of advertising material on the service and/or (once again, a conjunctive/disjunctive absence) the generation of revenue from the provision of advertising material on the platform.
A platform is an exempt platform if:
(a) none of the material on the platform is accessible to, or delivered to, one or more end-users in New Zealand; or
(b) the platform is specified in regulations.
At last – the use of a disjunctive!
Two references in this suite of definitions refer to regulations which clearly are yet to be made. This form of “secondary legislation” allows platforms to be added to the list of social media platforms as well as those that may be excluded.
There are some other important definitions.
Provider means a provider of an age-restricted social media platform.
Nothing too complicated or controversial here although the precise platforms (and therefore the providers) will have to wait for the regulations.
Age-restricted social media platform means a social media platform designated by regulations as an age-restricted social media platform.
Thus, although social media platforms are defined, those that are caught by the Bill will be designated in the regulations. This is very important and underpins the fact that to qualify for age verification the platform must be a designated one. The words “designated by regulations” are critical to the applicability of age-verification requirements.
An age-restricted user means a person under the age of 16 years.
Nothing too difficult in this definition. Now let’s see how all this is going to work.
Operative Language
The obligations of a provider are deceptively simple. Clause 7 of the Bill states:
A provider of an age-restricted social media platform must take all reasonable steps to prevent an age-restricted user from being an account-holder with their age-restricted social media platform.
This is the clause that clarifies the obligations of a provider of a social media platform. They must take "all reasonable steps" to prevent an age-restricted user (see the definition) from being an account holder. Whether the platform is an age-restricted social media platform awaits designation in the regulations.
Remember – an age-restricted platform must be one that is so designated by the Minister pursuant to Clause 14.
Lawyers love playing around with what is reasonable and what is not. I can remember having a debate in another context with a person who said that “reasonable” was a term invented by lawyers to mean whatever they liked. Clearly the drafter of the Bill (who is deficient in a number of areas) has anticipated that.
Clause 8 of the Bill sets out what constitutes reasonable steps:
“Without limiting section 7, reasonable steps means that which is, or was, at a particular time, reasonably able to be done by a provider in relation to preventing an age-restricted user from being an account-holder with their age-restricted social media platform, taking into account and weighing up all relevant matters, including-
(a) the privacy of the age-restricted user; and
(b) the reliability of the method used for a person to assure the provider that they are not an age-restricted user.”
This is really clumsy language. Let's break it down. "Reasonable steps" means "that which is reasonably able to be done by a provider". In using that language there are few if any guidelines, and a clear indication that what amounts to reasonable steps is fact-specific in each case. The certainty that the law requires is absent.
There are a couple of relevant matters that must be taken into account. One is the privacy of the age-restricted user (if an age-restricted user is able to open an account, which the Bill proposes to prevent, there would be no privacy implications, so this requirement is unclear). The other is the reliability of the method used for a person to assure the provider that they are not an age-restricted user. So if an age-restricted user gets through the net, as long as the method of age verification is reliable, the provider can be said to have taken reasonable steps.
As if to emphasise that this is a case-specific test, in addition to the two matters mentioned above, deciding whether the steps are reasonable involves "taking into account and weighing up all relevant matters". This is a broad test indeed. For myself, I think that the words "and weighing up" are redundant and all that should be required is "taking into account all relevant matters". But even so, the test is vague, fact-specific and uncertain. It will be difficult for providers to ensure that they have satisfied a test that is so vague.
The five clauses that follow address the matter of enforcement.
A breach of the Act (when it becomes an Act) will not be an offence. Rather, it will attract a pecuniary penalty, which is one of the ways to ensure compliance; the Unsolicited Electronic Messages Act 2007, dealing with spam, was an early example. If a Court is satisfied there has been a breach of Clause 8 it may impose a pecuniary penalty of up to $2,000,000.
In determining whether to impose a pecuniary penalty, and the amount of that penalty, the Court must have regard to the following matters:
(a) the extent to which the provider's conduct undermines the purposes of the Act:
(b) any harm caused to an age-restricted user as a result of the provider's failure to prevent them from accessing their age-restricted social media platform:
(c) whether the provider has taken steps to mitigate their failure:
(d) whether the provider's conduct was intentional or reckless:
(e) the circumstances of the provider's conduct:
(f) whether the provider has previously engaged in similar conduct:
(g) any other matters the court considers relevant.
It is important to note that Clause 12 provides a defence.
It is a defence for a provider to prove that their failure to prevent an age-restricted user from accessing their age-restricted social media platform was due to reasonable reliance on information provided by an age-restricted user.
Thus if a 15-year-old hands her phone over to her 16-year-old friend, the friend does a RealMe (or whatever) login on that phone and verifies for the social media app, then hands the phone back to the 15-year-old, the provider could rely on the information provided by the 16-year-old.
Finally the rules of civil procedure and civil standard of proof (on the balance of probabilities) apply, emphasising that this is a civil penalty.
Regulations
It will have been noted that Regulations play a significant – indeed critical - role in this Bill.
First, there is the power to make regulations for the following purposes:
“(a) providing for anything this Act says may or must be provided for by regulations:
(b) designating a social media platform as an age-restricted social media platform:
(c) designating an electronic platform as a social media platform:
(d) specifying an exempt social media platform:
(e) providing for anything incidental that is necessary for carrying out, or giving full effect to, this Act.”
It will be remembered that when it comes to designating a social media platform as an age-restricted platform the Minister may only make regulations if:
(a) the Minister is satisfied that it is reasonably necessary to do so in order to minimise harm to age-restricted users; and
(b) the Minister has received advice from the chief executive and has had regard to that advice; and
(c) the Minister has consulted with providers of platforms proposed to be designated as an age-restricted social media platform; and
(d) the Minister has received advice from any other entity that the Minister considers relevant and has had regard to such advice.
Note the use of the conjunctive “and”. This means that the Minister must be satisfied as to all four requirements before he or she may designate a social media platform as an age-restricted platform.
Clause 14, which contains the rules relating to regulations, is a critical part of the Bill. It puts the target of the Bill into narrow focus. The earlier discussion about what amounts to a social media platform and to whom age-verification applies is sharpened down to only those platforms designated by the Minister.
Review
Finally, after three years the operation of the Act is to be reviewed and any amendments that are necessary or desirable must be considered. The findings of the review are to be reported to the Minister, who must present a copy of the report to Parliament.
Issues and Enforceability
1. Jurisdiction – The Bill excludes from its scope platforms that are not accessible from New Zealand.
That means that it is likely to apply to platforms that are accessible in New Zealand (I know – stating the obvious).
Remember too that the platform or provider must be one that is designated as an age-restricted platform. We must keep that in mind as this discussion proceeds.
The first question is, does the platform have a physical presence in New Zealand for the purposes of being brought before the Court? Or will it be necessary for the prosecuting agency – the Chief Executive of the Department (which is neither named nor identified) – to apply for leave to serve proceedings overseas so that jurisdiction can be asserted?
And then the issue becomes one of whether or not the off-shore platform does anything about it or lets the law in this Pacific backwater take its course.
It may be a different matter if the platform has a presence here in New Zealand – Facebook is an example. But the issue becomes one of whether Facebook NZ is a provider within the scope of the Act or whether it handles a non-provider aspect of Facebook’s business – for example advertising. It could well be argued that because it does not fall within the ambit of a provider as defined, it is not liable.
The Bill does not have an extra-territorial clause – something that is very rarely present in legislation. Certainly it could be argued that off-shore activity is having an effect here in NZ. Canadian academic Michael Geist would suggest that jurisdiction is acquired when an audience is targeted. But our rules about asserting jurisdiction depend on service of proceedings.
These issues remind me of questions that I used to set for exams in my Law and IT course. All I can say at this stage is that the issue of jurisdiction is not clear. And it should be. Perhaps it is an additional matter that the Minister should take into account in designating a social media platform as an age-restricted one.
2. Retrospectivity – The Bill comes into force six months after Royal Assent. That means that its requirements apply from that date.
From the date of commencement providers are required to have age verification systems and to prevent <16s from accessing their platforms. Once again, of course, these providers have to be those that have been designated as requiring age-verification.
What is unclear (there are so many unclear things about this Bill) is whether the Act will apply to <16’s who registered and created an account BEFORE the Act commenced.
If the Act does not apply to that cohort, that would mean, firstly, that there would be a significant number of young people who had access to platforms which the Act would otherwise disallow.
Secondly, the cohort that tried to apply for an account AFTER commencement would be refused registration, which would make the effect of the legislation discriminatory.
If the proposal is that the Act should be retrospective – and it would require clear words in such a case – it would be in breach of section 12 of the Legislation Act 2019 which provides that legislation does not have retrospective effect.
That said, were it to be retrospective, it would mean that providers would have to verify the ages of existing users to ascertain whether their accounts should be maintained or cancelled. That would place an extremely heavy burden on providers.
3. Application – As it stands the Bill does not apply only to young people under 16 who want to set up an account with social media. EVERYONE who wishes to set up an account with a designated age-restricted platform will be required to verify their age before they will be allowed an account. That means EVERYONE. Those <16 will be excluded from joining. But anyone over 16 will still have to undergo an age-verification process.
This makes the system unnecessarily onerous for those >16 and it also means that data will be collected about both cohorts that will require an adjustment to privacy and data collection policies on the part of providers.
It seems to me that this aspect of the matter has received little or no consideration. It might be described as an “unintended consequence”.
In addition, there must be concerns about the way in which the age verification process will take place. It could well mean that there would have to be some form of universal age verification method approved by the State, which has significant implications for State surveillance and control. Some commentators have observed that this proposal is all about control, but I wonder if they are aware of the potential for such control.
4. Freedom of Expression – The proposal is a significant interference with the right of young people to impart and receive information, even though it is platform-specific in that the platform must be designated as an age-restricted one.
That is a breach of the freedom of expression that is guaranteed by section 14 of the New Zealand Bill of Rights Act. The importance of the freedom of expression was casually dismissed by Ms. Catherine Wedd, who is behind the Bill. She suggests the Bill doesn't breach NZBORA.
Because it is an interference with the freedom of expression, it does. What Ms. Wedd should have done was to explain why this proposal amounts to a reasonable limitation that can be demonstrably justified in a so-called free and democratic society within the meaning of section 5 of NZBORA. It is not good enough to say that it is obvious. Ms. Wedd is proposing a limitation of the freedom of expression. Let her justify it.
The way that could have been done is for the proposal to have been explained in a far more nuanced way. Once it is established that the proposal is limited in its application to those platforms designated as age-restricted ones it becomes easier to argue that such designation amounts to a justified limitation of the freedom of expression right. But that might be a bit too nuanced, and of course detracts from the impact and control of the messaging which suggests that the Bill will apply to all social media platforms when clearly it will not.
Perhaps a further matter that the Minister should take into account under Clause 14(2) is the impact that designation would have on freedom of expression. Factors such as the number of users and the impact on their freedom of expression would have to be taken into account. This, of course, is a very blunt argument as I have stated it, and it would require refinement.
5. Enforcement and Effectiveness – If the law is to have any credibility it must be enforceable. Although the civil penalty is set at a high level and the standard of proof is not at the criminal level, the law must be capable of effective enforcement.
What is significant in this proposal is that the designated age-restricted platforms are charged with the responsibility of employing reasonable age-verification programmes. That is the limit of their responsibility.
Nothing is mentioned about the steps that <16 users may take to circumvent the age verification processes. I have already given one example.
Another is the use of a virtual private network or VPN. Given that the proposal will apply only to New Zealand, age verification will not be required of someone setting up an account in Los Angeles. The location of a user is inferred from the Internet Protocol (IP) address assigned to them for a session. These addresses are allocated geographically: the IP address of a person in LA differs from that of a person in New Zealand. A VPN routes the user's traffic through a server elsewhere, so that the receiving equipment "thinks" the user is located in LA rather than AKL. A minimal sketch of the kind of location check a VPN defeats follows.
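The sketch below, again in Python, is illustrative only and assumes a provider that demands age verification only of apparently-New Zealand signups. The prefix-to-country table is invented for the example; real systems consult commercial GeoIP databases, but the logic – and the way a VPN sidesteps it – is the same.

```python
# Minimal sketch of IP-based geolocation and why a VPN defeats it.
# The prefix-to-country table is invented for illustration only; real
# systems use commercial GeoIP databases.
import ipaddress

# Hypothetical allocation table: network prefix -> country code.
GEO_TABLE = {
    ipaddress.ip_network("203.96.0.0/12"): "NZ",  # illustrative prefix
    ipaddress.ip_network("13.52.0.0/14"): "US",   # illustrative prefix
}

def country_for(ip: str) -> str:
    """Infer a country from the session's apparent IP address."""
    addr = ipaddress.ip_address(ip)
    for network, country in GEO_TABLE.items():
        if addr in network:
            return country
    return "UNKNOWN"

def age_gate_required(client_ip: str) -> bool:
    # The Bill reaches only end-users in New Zealand, so a provider
    # might demand age verification only of apparently-NZ signups.
    return country_for(client_ip) == "NZ"

# A user in Auckland connecting directly presents an NZ address:
print(age_gate_required("203.96.10.1"))  # True - verification demanded
# The same user routed through a Los Angeles VPN exit presents a US
# address, so the gate never fires:
print(age_gate_required("13.52.7.7"))    # False - silently bypassed
```

Note that nothing in the second request identifies the user as being in Auckland; the provider sees only the VPN server's address, which is why a geographically limited obligation is so easily engineered around.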
The issue of circumvention is not something with which the providers have to be concerned. Their obligation is to have a reasonable age verification system in place. Circumvention of an age verification system is something in which potential users will engage. This has implications not only for enforcement of the law, but for respect for the law; for issues of dishonesty and potentially fraudulent behaviour.
But circumvention of technical blocks – and age verification is one of those – has been a feature of online and digital systems from the outset. Circumvention of geo-blocking technologies such as the region-coding once applied to DVDs is an example. So great was the problem that laws prohibiting the use of circumvention devices were enacted – stringent ones in the USA and less stringent ones in New Zealand.
Another aspect of circumvention of geo-blocking technologies lies in obtaining access to overseas streaming services such as Hulu or the PBS streaming site out of the US. The simple use of a VPN allows access to these otherwise blocked services.
Associated with the issue of enforcement and effectiveness is another behaviour of users, known as regulatory arbitrage.
6. Regulatory Arbitrage – When I started teaching IT and Law, one of the things being talked about was the way that the Internet would be governed and the way that States could assert jurisdiction over it. There were a number of theories, and the one relevant to this discussion was developed by Michael Froomkin, who delivered a paper in 1996 entitled "The Internet as a Source of Regulatory Arbitrage." Froomkin's theory went like this.
The Internet is a transnational communication medium. Once connected, there is little that a single country can do to prevent citizens from communicating with the rest of the world without drastically reducing the economic and intellectual value of the medium.
As a result, connection to the Internet enables regulatory arbitrage by which persons can, in certain circumstances, arrange their affairs so that they evade domestic regulations by structuring their communications or transactions to take advantage of foreign regulatory regimes.
Regulatory arbitrage reduces the policy flexibility of nations by making certain types of domestic rules difficult to enforce.
Citizens with access to the Internet can send and receive anonymous messages regardless of national law; both censorship and information export restrictions become nearly impossible to enforce, although governments have it in their power to impose some impediments to ease of use.
In this context young people might avoid those providers who fall within the scope of the Act and register with other providers who are not nominated, thus using the flexibility that the Internet offers to choose their medium of communication.
Certainly the Bill itself provides an impetus to regulatory arbitrage in that by designating platforms as age-restricted ones the law is driving users to those platforms that do NOT have age-verification requirements.
7. Assumptions – This Bill is underpinned by certain assumptions. These are stated in the general policy statement, which justifies the legislation by saying:
“This approach is essential given the increasing negative impacts of social media on young users.”
This rests on the assumption that ALL social media platforms are prima facie harmful. That is incorrect. And as I have emphasised throughout, the Bill does not apply to ALL platforms – only those that have been designated.
This image from 2020 shows the plethora of platforms divided into groupings.
It cannot reasonably be asserted that every one of these platforms contains potentially harmful material or is a vehicle for harmful behaviour. Social media platforms can be used for a number of activities, some of them quite benign.
Certainly, in designating a platform as an age-restricted one, the Minister must take the four criteria mentioned into account. Clearly not all platforms will be so designated. Thus the proposal is not a universal ban on all social media.
The suggestion that social media platforms increase negative impacts on young people is a generalisation and an over-simplification of what is in fact a complex and nuanced ecosystem. The prime driver behind this Bill – apart from the obvious desire for control – is fear.
Another assumption is that this Bill will align New Zealand with efforts in Australia to implement a similar regime.
“Aligning with international efforts, including Australia's recent policy shift, the Bill provides a proactive, enforceable solution to safeguard children while fostering a safer digital environment.”
One thing that characterises the Australian approach is that it is part of a wider regulatory programme designed to bring the big platforms to heel. This has been a policy not only of the Albanese Government but of the Morrison Government that preceded it. The Australian Online Safety legislation targets platforms, as does the Australian equivalent of the Fair Digital News Bargaining Bill, and now this restriction on social media activity. New Zealand policy to date has not involved a targeting of the platforms, although the Safer Online Services and Web Platforms proposal had more than a hint of that objective.
Conclusion
As will be seen from this discussion, this Bill has a number of problems. Indeed, the policy behind it is flawed. It fails to understand the nature of the technology that it targets. It fails to understand user desires. It interferes with a communications system that is widely used by young people. It has problems in the manner in which it holds providers to account. It is potentially discriminatory or, if not that, retroactive.
Added to that, because it has been sold and publicised in the way that it has, the impression given is that it applies to all social media when it does not.
However, if this had been submitted to me as a legislative solution to the problems of social media it would have been handed back with a "try harder" suggestion, together with a recommendation, first, that we wait and see what happens in Australia. Their legislation has problems already, and an exception has been made for YouTube, which does not fall within the scope of the Australian proposal.
Secondly, I would abandon this current effort and have a look at a totally different model – one that understands user desires, that recognises the importance of a communications system and works towards assisting and empowering parents to deal with online activities and likely harms that may arise.
Thirdly, if this defective and unfortunate Bill were to proceed some consideration should be given to recasting it – ideally along the lines suggested in my second point immediately preceding.
Certainly, if it gets drawn from the biscuit tin and gets to a First Reading, a detailed Select Committee examination would be required, and I would imagine that there would be significant changes.
If that does not happen and the Bill is enacted in its present form it will fail in its objective and will reduce and degrade respect for the law as young people in their droves circumvent technologies, engage in regulatory arbitrage and interact with one another in an ecosystem that is not plagued by this flawed exercise.
It should be noted that young people are particularly astute with IT and will quickly find ways around this blunt legislation – ways of which its drafters have no idea.
Meanwhile, we wrinklies will have to fluff around verifying that we are over 16 by providing our personal details to platforms which we must blindly trust to keep them secure… It sounds like a data security issue waiting to happen.
Not to mention surveillance…