At its monthly meeting last week, the FCC voted to adopt a Declaratory Ruling declaring that calls made with AI-generated voices are "artificial" as defined by the Telephone Consumer Protection Act (TCPA). As such, the FCC has declared that using cloned voices to make calls to the public is illegal. In an interesting step, the ruling is effective immediately – most FCC orders take effect some time after a ruling.
An AI-generated voice used for political purposes was recently in the news when somebody used a cloned Joe Biden voice in New Hampshire to call potential voters with the advice to stay home and not vote in the primary election. The ruling was not made in response to that particular call; the issue had been under investigation at the FCC for several months. But that call is a great example of why the FCC has been investigating the issue.
The FCC ruling hopes to help protect the public from an expected deluge of AI-generated calls used for marketing. AI can be used to mimic the voices of celebrities, sports stars, politicians, or even family members to try to generate more effective marketing, political, or scam calls.
Scammers of all sorts have already latched onto the new AI technology, and this new ruling gives law enforcement the ability to crack down on people making scam or junk calls using AI technology.
Under the TCPA, the FCC can also levy heavy fines against companies that use AI technology to call people. The TCPA already makes it illegal to call consumers using automatic dialing systems or artificial or prerecorded voice messages. This is the same law that created the Do Not Call List and requires marketers to have a pre-existing relationship with a customer or get explicit permission before calling the public for marketing purposes.
The FCC has been putting a lot of effort over the last year into squelching junk and scam calls. Forty-eight state Attorneys General have agreed to work with the FCC to tackle and prosecute those who make robocalls.
In August 2023, the FCC issued a $300 million fine against a company that had made over 5 billion calls in just three months trying to sell auto warranties. The Attorney General of Ohio worked with the FCC to shut down that operation, which led to a 99% reduction in such calls. But for every spam caller shut down, others seem to take their place, since spam calls obviously generate enough revenue to justify the effort. The FCC also recently threatened to shut down carriers that knowingly assist those who make spam calls.
Unfortunately, a lot of spam calling now originates from overseas, and that just makes it harder to find the source and shut it down. However, the large carriers in the industry have been putting a lot of effort into killing junk calls since that is the number one complaint they continue to receive from customers.
This ruling should make a lot of companies nervous that have been planning to use AI-generated calls to communicate with customers. The ruling suggests it's illegal for a company to use AI calls that mimic celebrity voices, even with existing customers. A grayer area is using AI-generated voices in any manner to sell to customers – something many marketing departments have been planning. Even grayer is using AI-generated voices to handle tasks like scheduling or answering customer questions.