Instagram’s New AI Age Detection Tool
Meta is planning to roll out an AI tool on Instagram designed to verify the ages of its users, addressing growing concerns about the platform's impact on young people.
Dubbed the ‘adult classifier’, this system aims to determine a user's true age based on various aspects of their profile, including their follower list, the content they engage with, and even birthday posts from friends.
Should the tool suspect that a user is under 18, they will automatically be assigned to a more restrictive version of the app, regardless of their declared age.
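To make the idea concrete, the sketch below shows, in very rough terms, how weak profile signals of this kind might be combined into a single score. It is purely illustrative: the signal names, weights and threshold are hypothetical assumptions, not Meta's actual model, which remains proprietary.

```python
# Purely illustrative sketch of a signal-based age estimator.
# All signals, weights and the threshold are hypothetical; this is not
# Meta's "adult classifier", only an example of the general approach.
from dataclasses import dataclass, field


@dataclass
class ProfileSignals:
    follower_median_age: float      # estimated median age of the account's followers
    teen_content_engagement: float  # share of engagement with teen-oriented content (0-1)
    birthday_post_ages: list[int] = field(default_factory=list)  # ages implied by friends' birthday posts


def estimate_is_minor(signals: ProfileSignals, threshold: float = 0.5) -> bool:
    """Combine several weak signals into a single 'likely under 18' score."""
    score = 0.0
    if signals.follower_median_age < 18:
        score += 0.4
    score += 0.3 * signals.teen_content_engagement
    if signals.birthday_post_ages and max(signals.birthday_post_ages) < 18:
        score += 0.3
    return score >= threshold


# Example: an account whose friends' birthday posts imply an age of around 15.
profile = ProfileSignals(follower_median_age=16.5,
                         teen_content_engagement=0.8,
                         birthday_post_ages=[14, 15])
if estimate_is_minor(profile):
    print("Route account to the restricted teen experience")
```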
Stricter Controls for Teen Accounts
In September 2024, Meta introduced ‘teen accounts’ that come with heightened restrictions.
These accounts are private by default and limit who users can message and what types of content they can access.
Additionally, they include time-limit reminders and parental controls that allow guardians to manage their children's settings and monitor their interactions on the platform.
Meta has already taken steps to protect younger users, and the new adult classifier represents a more advanced approach to age verification.
Users may be required to validate their age with an ID if they attempt to change their birth date.
Alternatively, users have been able to rely on a method called ‘social vouching’, in which three followers aged 18 or over confirm their age, although Meta has since dropped this option.
Furthermore, Meta has partnered with the tech firm Yoti to allow users to verify their age using video selfies, adding another layer of scrutiny to age claims.
Regulatory Pressure Fuels Change
The urgency behind these developments stems from mounting regulatory and public pressure concerning the effects of Instagram on adolescents.
Reports reveal that since early 2019, Meta has received over 1.1 million notifications about underage users on Instagram, yet only a small fraction of these accounts have been disabled.
The Wall Street Journal highlighted that the company was aware of the significant mental health challenges faced by teenagers, particularly young girls, stemming from their interactions on the platform.
The stakes intensified in 2023 when 41 U.S. states filed a lawsuit against Meta, alleging that the company knowingly designed features that promote addiction among young users and contributed to a mental health crisis.
The lawsuit also raised concerns regarding compliance with the Children’s Online Privacy Protection Act (COPPA), accusing Meta of processing children's data without obtaining parental consent.
Proactive Measures and Public Backlash
The landscape of social media has shifted dramatically, with Meta under scrutiny for its role in the mental health struggles of teenagers.
Frances Haugen, a former Facebook employee, disclosed internal research indicating that Instagram could negatively impact the well-being of adolescent girls.
As a result, Meta has implemented a series of measures to enhance the safety of teenage users, which have garnered support from various advocacy groups.
The challenge remains for Meta to enforce these new age restrictions effectively.
A study by Ofcom, the UK's communications regulator, found that a third of minors on social media misrepresent their age, declaring themselves to be 18 or older.
Given the ease with which users can falsify their age online, this task is far from straightforward.
Robust Verification Strategies in Development
To counteract age misrepresentation, Instagram plans to flag users who attempt to create new accounts with inconsistent birth dates linked to the same email address.
The platform can also analyse a device's unique ID to ascertain whether a new profile belongs to a previously registered user.
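As an illustration only, the toy sketch below shows the kind of consistency check this describes; the data model, identifiers and flag wording are hypothetical assumptions and bear no relation to Instagram's real systems.

```python
# Illustrative sketch of a duplicate-account consistency check.
# The in-memory stores, identifiers and flag messages are hypothetical,
# not Instagram's implementation.
from datetime import date

# Existing registrations keyed by email and by device ID (toy in-memory stores).
birthdates_by_email: dict[str, set[date]] = {
    "sam@example.com": {date(2010, 3, 2)},
}
emails_by_device: dict[str, set[str]] = {
    "device-1234": {"sam@example.com"},
}


def flag_new_signup(email: str, device_id: str, claimed_birthdate: date) -> list[str]:
    """Return reasons to send a signup for additional age verification."""
    flags = []
    # Same email previously registered with a different birth date.
    previous = birthdates_by_email.get(email, set())
    if previous and claimed_birthdate not in previous:
        flags.append("birth date conflicts with an earlier registration for this email")
    # Device already tied to other accounts, suggesting a returning user.
    if emails_by_device.get(device_id):
        flags.append("device ID already linked to an existing profile")
    return flags


# Example: a returning user claims a new, older birth date on the same device.
print(flag_new_signup("sam@example.com", "device-1234", date(2006, 3, 2)))
```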
For those attempting to alter their listed age, Instagram will require proof in the form of a government-issued ID or a video selfie to be verified by Yoti.
This verification process aims to ensure that users are truthful about their age; with social vouching eliminated, a government-issued ID or video selfie is now the only route to changing a listed birth date.
Appeals for Misclassified Users
While Meta is refining its adult classifier, users who are incorrectly categorised as minors will eventually have the option to appeal their status.
A spokesperson indicated that this appeal process is currently under development.
For the time being, wrongly classified individuals can manually adjust their account settings without needing parental consent.
The extensive data that Instagram already collects from users' profiles is what makes this kind of age inference possible.
The company acknowledges that teens misrepresenting their age has been less of a concern than initially anticipated.
However, with the rise of online threats and increasing scrutiny from regulators, the focus on creating a safer environment for young users has never been more critical.
The Role of App Stores in Age Verification
Meta’s executives have expressed the belief that age verification should extend to app stores, suggesting that this could be a more comprehensive solution.
However, representatives from Apple and Google have countered that such measures would violate data minimisation principles, with Google stating that there isn't a “single solution” to the issue.
Critics argue that this approach merely shifts the responsibility of age verification rather than solving the underlying problems of honesty and accountability among users.