The ruling, which the FCC unanimously adopted on Feb. 8, gives state attorneys general “new tools” to crack down on those who use voice-cloning technology to perpetrate robocall scams, Rosenworcel added.
While robocall scams using AI-generated voices were already considered illegal, Thursday’s ruling makes clear that using an AI-generated voice in a robocall is itself illegal, according to the FCC.
AI-generated voice technology has grown increasingly sophisticated, capable of producing strikingly realistic voices, and it has made phone scams easier and cheaper to perpetrate.
The technology’s rising prevalence was on display before January’s New Hampshire primary, when voters received calls from a voice impersonating Biden. The voice called the election “a bunch of malarkey” and urged voters to “save your vote for the November election.” Biden was not on the ballot in that primary, but a group of Democrats had organized a write-in campaign to show support for the president.
New Hampshire Attorney General John Formella (R) this week announced a criminal investigation into a Texas-based company suspected of being behind the thousands of calls to his state’s voters. And he issued a warning to others who may seek to use the technology to interfere with elections.
“Don’t try it,” he said. “If you do, we will work together to investigate, we will work together with partners across the country to find you, and we will take any enforcement action available to us under the law. The consequences for your actions will be severe.”