According to Decrypt, the UK government is considering using artificial intelligence (AI) for age verification on adult websites as part of the Online Safety Act 2023. The act, which received Royal Assent on October 26, 2023, aims to create a safer internet for UK users, particularly children, by requiring service providers to implement effective age verification methods. One of the six proposed methods would have users take selfies and submit them to the government for AI verification that they are adults.
The proposal does not specify which AI tools or techniques would be used; it requires only that the methods be reliable, that any age assurance approach allowing a degree of variance be suitably tested, and that it be derived from a trustworthy source, according to Ofcom, the UK's communications regulator. Other methods outlined in the draft document include credit card verification, photo-ID matching, digital identity wallets, and mobile network operator checks. Ofcom is tasked with preventing children from accessing pornographic content while ensuring adults retain unhindered access to legal content.
However, the government is unsure whether AI will effectively solve the problem, and Ofcom expects service providers to continually refine and update their methods to ensure accessibility, interoperability, and compliance with data protection laws. The UK must now grapple with the intersection of innovation and privacy, as the use of selfies for age verification raises concerns about surveillance and control. Ofcom's Chief Executive, Dame Melanie Dawes, emphasized the importance of protecting children while safeguarding adults' privacy rights and their freedom to access legal content. Final guidance is expected to be published in early 2025.