ChatGPT is no longer taking its users at their word regarding their age.
In a move to bolster its safety protocols, OpenAI has introduced an AI model designed to predict whether a person behind the screen is a minor, regardless of the birthdate provided at sign-up.
The system, which began its global rollout on 20 January 2026, shifts the platform away from a traditional "honor system" toward an automated oversight model that monitors how people actually use the service.
By moving beyond simple text boxes, OpenAI is attempting to address long-standing safety concerns that have plagued the industry throughout 2024 and 2025.
How Does AI Guess Your Age?
The new system does not rely on ID documents upfront.
Instead, it scrutinizes "behavioral signals" to estimate a user’s age.
This includes a detailed analysis of how long an account has existed, what time of day it is most active, and specific usage patterns that develop over time.
OpenAI explained that “deploying age prediction helps us learn which signals improve accuracy, and we use those learnings to continuously refine the model over time.”
When the algorithm suspects a user is under 18, ChatGPT automatically triggers a more restrictive experience.
These safeguards are designed to block exposure to graphic violence, sexual or romantic roleplay, and depictions of self-harm.
If the system is unsure, it defaults to these stricter settings to ensure maximum protection for potential minors.
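OpenAI has not published the actual features or thresholds its model uses, but the logic described above, combining behavioral signals into an age estimate and falling back to the restrictive mode when uncertain, can be sketched as a toy heuristic. Every signal name and weight below is a hypothetical placeholder, not OpenAI's real implementation:

```python
from dataclasses import dataclass

# Hypothetical signals for illustration only; OpenAI has not
# disclosed its real feature set or model architecture.
@dataclass
class UsageSignals:
    account_age_days: int    # how long the account has existed
    late_night_share: float  # fraction of activity between 22:00 and 06:00
    school_hours_gap: bool   # activity drops sharply during school hours

def likely_minor_score(s: UsageSignals) -> float:
    """Combine behavioral signals into a 0..1 'likely minor' score."""
    score = 0.0
    if s.account_age_days < 90:
        score += 0.3
    if s.late_night_share > 0.4:
        score += 0.2
    if s.school_hours_gap:
        score += 0.4
    return min(score, 1.0)

def content_mode(score: float, low: float = 0.3, high: float = 0.7) -> str:
    """Pick an experience tier; default to restricted when unsure."""
    if score >= high:
        return "restricted"  # treated as a likely minor
    if score <= low:
        return "standard"    # treated as a likely adult
    return "restricted"      # uncertain -> err toward protecting minors
```

The key design choice mirrored here is the asymmetric fallback: the middle band between the two thresholds resolves to the restricted experience, matching OpenAI's stated policy of defaulting to stricter settings when the prediction is ambiguous.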
Can Behavioral Patterns Be Trusted?
While OpenAI views this as a necessary safety layer, digital rights advocates are skeptical about the accuracy of inferring age from metadata.
Experts have pointed out that distinguishing between a teenager and a young adult based on habits is notoriously difficult.
Aliya Bhatia, a senior policy analyst at the Center for Democracy and Technology, noted,
“It's not easy to distinguish between an educator using ChatGPT to help teach math and a student using ChatGPT to study.”
Data from the 2024–2025 school year highlights this overlap, with 85% of teachers and 86% of students reporting regular use of AI tools.
Critics argue that early adopters and students using the tool for academic support could easily be misclassified.
Bhatia added,
“Just because a person uses ChatGPT to ask for tips to do math homework doesn’t make them under 18.”
Restoring Access Through Biometric Verification
For adults trapped in "teen mode" by a faulty prediction, the only way out is a more invasive verification process.
Users must submit a live selfie to Persona, a third-party identity service, to prove their adulthood.
This reliance on external vendors has raised its own set of privacy fears.
In October 2025, a separate third-party vendor used by Discord suffered a breach that exposed 70,000 government IDs, leaving users wary of where their biometric data ends up.
OpenAI has not detailed exactly how long these verification documents will be retained but insists that this process is the most reliable way to correct the algorithm’s inevitable mistakes.
Why Is OpenAI Acting Now?
The pressure to implement these changes is not just about internal safety goals; it is a response to intense legal and regulatory scrutiny.
In September 2025, the Federal Trade Commission (FTC) issued orders to OpenAI, Alphabet, Meta, and xAI, demanding transparency on how they manage harmful interactions with children.
Furthermore, the company has been hit by lawsuits from parents alleging that the chatbot failed to intervene or even encouraged teens during moments of psychological distress.
J.B. Branch, a tech accountability advocate at Public Citizen, suggested the move is largely defensive.
“These companies are getting sued left and right for a variety of harms that have been unleashed on teens, so they definitely have an incentive to minimize that risk,” Branch said.

Beyond safety, the age-prediction rollout serves as a precursor to an anticipated "adult mode."
By perfecting its ability to filter out minors now, OpenAI is laying the groundwork to offer more mature content to verified adults in the future.