According to CryptoPotato, the Federal Communications Commission (FCC) has officially prohibited the use of artificial intelligence-generated voices in unsolicited robocalls across the United States. The decision follows an incident in which New Hampshire residents received fabricated voice messages mimicking U.S. President Joe Biden that discouraged them from voting in the state’s primary election. The ban, implemented under the Telephone Consumer Protection Act (TCPA), aims to curb the proliferation of robocall scams.
FCC Chairwoman Jessica Rosenworcel stated that bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters. The latest ruling extends the prohibition to cover “voice cloning technology,” effectively outlawing a key tool used by scammers in fraudulent schemes. The TCPA protects consumers from intrusive communications and “junk calls” by imposing restrictions on telemarketing practices, including the use of artificial or pre-recorded voice messages.
In a related development, authorities have traced the recent high-profile robocall imitating President Joe Biden’s voice back to a Texas-based firm, Life Corporation, and an individual identified as Walter Monk. Attorney General Mayes has since sent a warning letter to the company, and New Hampshire Attorney General John Formella has confirmed that a cease-and-desist letter has been issued and that a criminal investigation is underway. The robocall, circulated to thousands of Democratic voters on January 21, urged recipients not to vote in the primary and to instead save their votes for the November general election.