Age Restrictions, Social Media and Coincidence
The State Gets Closer to the Holy Grail
In what can only be described as something of a surprise, on 23 October Catherine Wedd’s Social Media Age Restricted Users Bill was drawn out of the biscuit tin and awaits a First Reading.
On the same day a Bill amending the Harmful Digital Communications Act was drawn out of the same biscuit tin. This Bill purports to correct an anomaly in the legislation and extend the meaning of an intimate image in the HDCA and the Crimes Act to include AI-created deepfake images of identifiable individuals. There is nothing objectionable in this proposal; it too awaits a First Reading and will, one hopes, pass swiftly.
However, what is remarkable is the coincidence that on the same day, out of the same biscuit tin, two Bills were drawn that address perceived problems in the digital space. The HDCA amendment is, as I have suggested, unremarkable. The Social Media Bill continues a campaign to restrict access to social media platforms by those under 16.
When the proposal – strongly backed by a group of well-heeled bourgeois individuals who purported to know better than parents themselves how children in New Zealand should be brought up, and assisted by a compliant Prime Minister who could see easy votes in it – was first put forward, it was described as a ban.
It isn’t quite that.
It is a short Bill – only 10 operative clauses in its current form, although that may change as it wends its way through the legislative process.
As is the case with all Bills (or Acts for that matter) the devil is in the detail and the detail starts with the stated purpose of the Bill which is to:
“reduce the risk of harm to children from certain kinds of social media platforms by requiring providers of these platforms to take reasonable steps to prevent persons under 16 years of age from accessing them.”
Note the reference to “certain kinds of social media”. The Bill is not directed at all social media platforms.
The next “detail devil” is the interpretation section – commendably brief (5 definitions) but in reality uninformative.
An age-restricted social media platform means a social media platform designated by regulations as an age-restricted social media platform.
An amazing example of circularity – an age-restricted social media platform is an age-restricted social media platform, designated as such by the regulations. Of course, there are no regulations yet, so we have no idea which social media platforms are going to be covered.
An age-restricted user means a person under the age of 16 years
No problems there. It is clear enough.
An account-holder, in relation to a social media platform,—
(a) means the person that is registered as a user of the platform; and
(b) includes a person that has an account, or has established a profile, with the platform
This clarifies the scope of the proposal.
A provider means a provider of an age-restricted social media platform
But the problem here is that we do not know who the providers might be until the regulations identify an age-restricted social media platform.
Clause 5 actually clarifies part of the definition problem by defining a social media platform.
It reads as follows:
“(1) For the purposes of this Act, social media platform means:
(a) an electronic platform that satisfies the following conditions:
(i) the sole or primary purpose of the platform is to enable online social interactions between 2 or more end-users:
(ii) the platform allows end-users to link to, or interact with, some or all of the other end-users:
(iii) the platform allows end-users to post material on the platform; or
(b) an electronic platform specified in regulations.
(2) For the purposes of subparagraph (1)(a)(i), online social interaction includes online social interactions that enables end-users to share material for social purposes.
(3) In determining whether the condition set out in subparagraph (1)(a)(i) is satisfied, disregard the following purposes:
(a) the provision of advertising material on the service:
(b) the generation of revenue from the provision of advertising material on the platform.
(4) For the purposes of this section, a platform is an exempt platform if:
(a) none of the material on the platform is accessible to, or delivered to, one or more end-users in New Zealand; or
(b) the platform is specified in regulations.”
1(a) sets out three criteria for a platform to qualify as a social media platform. It is unclear whether those criteria are cumulative or not. Do all three criteria have to be present? Normally, if they were so required, there would be a conjunctive “and” at the end of each sub-clause. If they were disjunctive – that is, only one of the criteria is needed – then the disjunctive “or” would be present. Neither conjunctive nor disjunctive is present.
Then 1(b) repeats the circularity in the definitions section – a social media platform is an electronic platform (undefined) that is specified in the as-yet non-existent regulations.
Would this Substack fulfil the definition of a social media platform? As matters stand (unless some regulatory Orc were to determine otherwise), no, because the primary purpose is not to enable social interactions between 2 or more persons, although that may be a consequence. The purpose is to provide an outlet for my writing, designed to inform and amuse (although I do not discount the possibility of annoyance).
The rest of the definition is reasonably clear.
The rubber meets the road when we come to the part of the Bill that sets out provider duties. Clause 7 states:
A provider of an age-restricted social media platform must take all reasonable steps to prevent an age-restricted user from being an account-holder with their age-restricted social media platform.
No methodology is specified. Just “all reasonable steps”. No specific process for age verification is set out. It is all up to the provider.
So what are reasonable steps? Clause 8 states:
Without limiting section 7, reasonable steps means that which is, or was, at a particular time, reasonably able to be done by a provider in relation to preventing an age-restricted user from being an account-holder with their age-restricted social media platform, taking into account and weighing up all relevant matters, including—
(a) the privacy of the age-restricted user; and
(b) the reliability of the method used for a person to assure the provider that they are not an age-restricted user.
Reasonable steps means something that is reasonably able to be done. The use of circularity in this Bill just gets better and better.
These “reasonable steps” are very wide. They could include age verification processes, which could be quite intrusive. But irrespective of whatever may be deployed, any person who wishes to become an account holder on a social media platform is going to have to undergo some form of process to determine whether they are over or under 16.
The provisions relating to enforcement establish a civil penalty regime. This process is common in this sort of legislation; it was deployed in the Unsolicited Electronic Messages Act 2007. Civil penalties can be imposed for breaches. The standard of proof is the balance of probabilities (not beyond reasonable doubt), but there is a defence available. It is a defence for a provider to prove that their failure to prevent an age-restricted user from accessing their age-restricted social media platform was due to reasonable reliance on information provided by an age-restricted user.
Finally there are the Regulations. The regulatory powers are stated as follows:
(1) The Governor-General may, by Order in Council made on the recommendation of the Minister, make regulations for all or any of the following purposes:
(a) providing for anything this Act says may or must be provided for by regulations:
(b) designating a social media platform as an age-restricted social media platform:
(c) designating an electronic platform as a social media platform:
(d) specifying an exempt social media platform:
(e) providing for anything incidental that is necessary for carrying out, or giving full effect to, this Act.
(2) For the purposes of subsection (1)(b) and the Minister may only make regulations under that subsection if—
(a) the Minister is satisfied that it is reasonably necessary to do so in order to minimise harm to age-restricted users; and
(b) the Minister has received advice from the chief executive and has had regard to that advice; and
(c) the Minister has consulted with providers of platforms proposed to be designated as an age-restricted social media platform; and
(d) the Minister has received advice from any other entity that the Minister consider relevant and has had regard to such advice.
(3) Regulations made under this section are secondary legislation (see Part 3 of the Legislation Act 2019 for publication requirements).
These are wide powers and vest in the Minister considerable discretion. As long as the Minister fulfils the procedural requirements set out in (2) the regulatory power is available.
I have pointed out some drafting infelicities and anomalies in this Bill. Subsection (2) above has, in its introductory sentence, an “and” that seems to have crept in and complicates the meaning.
However, the limitation of the applicability of the Act to designated age-restricted social media platforms means that the Bill proposes not an outright ban but a restriction – quite a significant restriction, depending on the platforms designated as age-restricted.
Understandably there are issues about the freedom of expression enjoyed by young people and their ability to communicate via social media. One would have thought that parents could make the decisions that need to be made. They don’t need Parliament backed by the B416 pressure group to tell them how to bring up their kids.
The Free Speech Union, predictably, has come out against the Bill. In a press release it states:
The Free Speech Union warns that the Members’ Bill to age-gate social media would do more harm than good. The effect of social media on young people is a conversation that is needed, but any response must protect, not limit, Kiwis’ speech rights, says Jillaine Heather, Chief Executive of the Free Speech Union.
“There are many valid concerns about what social media usage is doing to our young people. But a sweeping ban is not the answer. Heavy-handed government overreach or the threat of heavy penalties will incentivise platforms to remove content, switch off features, or withdraw services rather than defend lawful but messy debate.
“How will the state or tech companies monitor who’s who? By creating hoops that we all have to jump through, regardless of age. This is not just about those who are under 16 years old – we’ll all have to provide verification if we want to speak freely online. Censorship will be applied through compliance. The internet is the modern-day public square and access to it should be made freer, not restricted.
“Vague and subjective definitions of ‘harm’ will be expanded and abused, increasing arbitrary takedowns and asymmetric enforcement. We accept that real harms occur online but jumping into unproven age-verification or digital-ID regimes is not the answer. It’s simply a sticking plaster that makes us feel good about a problem, but in the long run, it will create much bigger ones.
“Parliament should reject heavy-handed approaches and instead advocate for more education for young people and parents to increase digital literacy. Other measures could include opt-in tools, civic education, community initiatives, and better parental resources. Individuals should be equipped from the ground up, not limited by the state.”
It is highly likely that this Bill will cause more problems than it will solve. The drafting infelicities will hopefully be attended to as the Bill progresses.
But in the bigger picture, apart from the serendipitous coincidence of this Bill with the HDCA amendment, this Bill represents yet another step by the State to attempt to regulate the use of Internet content – another step towards Carbonek, the castle of the Holy Grail that I documented in my article The Holy Grail of Internet Content Control.