The Online Safety Act 2021 (which came into effect on 23 January 2022) does grant the eSafety Commissioner significant powers over online content, including pornography [1].
The Online Safety Act 2021 empowers the eSafety Commissioner to issue "remedial notices" requiring online services to ensure that restricted access systems prevent children from accessing Class 2 material (which includes pornography) [2].
Section 108 of the Online Safety Act specifies that restricted access systems must "incorporate reasonable steps to confirm that an applicant is at least 18 years of age" [4].
The government's response in August 2023 stated it would "await the outcomes of the class 2 industry codes process before deciding on a potential trial of age assurance technologies" [7].
Following a National Cabinet meeting on 1 May 2024, the government announced it would fund a pilot of age assurance technology to test efficacy "including in relation to privacy and security" [8].
However, this is standard regulatory design in Australia - the ACCC chair, Privacy Commissioner, and many other regulators are similarly appointed by government [9].
**Regarding "mandatory" powers**: The eSafety Commissioner's actual powers are constrained.
The Commissioner can only:
- Issue remedial notices (not removal notices) for Class 2 material requiring implementation of restricted access systems
- The notices only apply to services "provided or hosted from Australia" [10]
- Services must comply with industry codes registered under the Act [11]
The Act's Basic Online Safety Expectations are **not enforceable in court** [12], and compliance is monitored through reporting requirements, not direct authority to mandate specific technologies [13].
1. **Global precedent**: Multiple US states have passed similar laws [16].
2. **Privacy safeguards in development**: The pilot's explicit inclusion of privacy and security testing contradicts the claim that these implications are not considered [17].
The eSafety Commissioner's roadmap specifically discussed privacy risks and recommended cautious rollout [18].
3. **Technology neutrality**: The Act does not mandate facial recognition specifically.
It allows multiple approaches - bank verification, account history analysis, and other methods are explicitly contemplated as alternatives to facial recognition [19].
4. **Parliamentary oversight**: The Online Safety Act 2021 is currently under statutory review (announced February 2024), providing a mechanism for parliamentary scrutiny of the eSafety Commissioner's powers [20].
The headline ("could bring in") indicates uncertainty.
**Digital Rights Watch** is a civil rights advocacy organization focused on digital privacy and freedom.
The organization characterized the Bill's provisions in alarmist terms (e.g., "the Bill introduces provisions for powers that are likely to undermine digital rights"), which reflects advocacy positioning rather than neutral analysis.
**Did Labor do something similar?**
The current Australian Labor government (since May 2022) has continued and expanded the Online Safety Act framework rather than opposed it.
The Labor government:
- **Maintained the Act**: Did not repeal or significantly roll back Coalition-era online safety legislation
- **Expanded age verification initiatives**: Announced in May 2024 (under Labor) that it would fund a pilot of age assurance technology [21]
- **Commissioned research on social media age limits**: The Department of Infrastructure is undertaking research into potential age limits for social media generally [22]
- **Supported class 2 industry codes development**: The Labor government has continued development of industry codes for pornography and age-inappropriate content [23]
In fact, the Labor government has moved *faster* and more decisively on age verification than the Coalition did.
This is consistent with policy direction in the UK (Conservative government), EU (multiparty consensus), and multiple US states (both Republican and Democratic controlled).
**The legitimate criticism**: The eSafety Commissioner does hold significant power as an unelected official, and there are genuine privacy concerns around age verification technologies, particularly facial recognition.
These concerns have been raised by:
- Digital Rights Watch (privacy advocacy) [24]
- Privacy advocates noting risks of data collection and storage [25]
- Technology experts warning about accuracy issues with facial recognition across demographic groups [26]
These are valid concerns that merit serious consideration.
**The government's response to these concerns**:
1. The Online Safety Act is structured as regulatory rather than absolute authority - the Commissioner works through industry codes rather than direct mandates [27]
2. Parliament retains oversight through statutory review mechanisms [30]
**Why the claim is misleading**:
The claim presents decisions that are currently being made (age assurance pilot) as if they are already established mandatory policies.
The specific charge that officials "are not required to consider privacy implications" contradicts documented government statements explicitly evaluating privacy in the pilot [31].
The claim also conflates "powers granted to" with "decisions made by" - the Act gives the eSafety Commissioner power to require restricted access systems; the Commissioner has not yet mandated facial recognition and has explicitly recommended cautious, tested implementation.
**Broader context**: This is part of a global policy trend toward age verification for pornography, driven by child safety concerns.
Whether one agrees with this policy direction or not, it is neither unique to Australia nor unique to the Coalition government - Labor is pursuing it more aggressively.
The claim is partially accurate in identifying real powers granted under the Online Safety Act, but fundamentally misleading in three ways:
1. **Specificity error**: The Act does not mandate facial recognition; it requires "reasonable steps" for age verification using unspecified technology [32]
2. **Causation error**: The claim suggests privacy considerations are ignored, but government documents explicitly evaluate privacy and security as part of pilot design [33]
3. **Temporal error**: The claim presents potential future requirements as current policy; no mandatory age verification scheme currently exists in Australia [34]
The core concern about unelected regulatory power is legitimate and worth debating.