March 13, 2024

FCC Proposes Ban on AI-Generated Robocalls

Yet another robocall threat has emerged to prey on the public: AI-generated voices in robocalls. These artificial voices mimic celebrities, politicians, and family members to gain the call recipient’s trust and convince them to give money to the callers or take potentially harmful action.

Dangers of AI-Generated Robocalls

This latest tactic is especially effective since consumers recognize the voices, believing the call must be legitimate. After all, would George Clooney steer you wrong? What about your aunt? If the call sounds legitimate, many call recipients let their guard down. They may believe their first cousin needs emergency money or that they should contribute to a nonexistent cause.

The problem has escalated in recent months. The FCC recently proposed strong legal action against these calls to protect the American public. Even so, consumers remain at serious risk and need more information about this scam.

FCC Action Against Voice Cloning Technology

The government has recently focused its attention on these calls. On January 31, 2024, FCC Chairwoman Jessica Rosenworcel announced that the Commission believed AI-generated voices fit the definition of “artificial voices” in the Telephone Consumer Protection Act (TCPA). Under that reading, these robocalls are already illegal, which should make taking action to stop them much easier. The Commission then voted unanimously to confirm that using voice cloning technology in robocalls is illegal under the TCPA. As Rosenworcel noted, “. . . it is possible we could all be a target of these faked calls.”

State Cooperation

The FCC believes cooperation with individual states is key to controlling these robocalls. The Commission has received opinions from 26 State Attorneys General on how to protect consumers from AI-generated voice robocalls. In addition to consulting with individual state governments, the FCC is working to give them better tools to battle this new robocall threat. This effort aligns with the existing Memorandum of Understanding the FCC has with 48 states to fight robocalls of all types.

The Catalyst

Authorities were already concerned about the use of these AI-generated calls last year. They became more so after a January campaign robocall featuring a voice sounding like President Biden reached potential voters before the New Hampshire primary election. The implications of this call were alarming as they could potentially affect the presidential election and other national and down-ballot contests.

While the public is incredibly suspicious of robocalls, a familiar voice could overcome their reluctance and convince them to donate money, vote a certain way, or take other harmful actions. This New Hampshire scam brought the AI-voice threat into clear focus.

Impact on Businesses

AI-generated voices on robocalls are a frightening development that has far-reaching effects. Voice cloning endangers multiple industries, including the following:

Call Centers/Customer Service

Customer service centers are the main line of communication with customers. Sadly, AI-voice cloning can mimic the voices of real customer service agents, allowing bad actors to launch phishing schemes and other scams while posing as familiar company representatives. Consumers can no longer “believe their ears” when they receive an automated call. This technology makes them even more vulnerable to phone scams than in recent years.

Financial Companies

Your banker and your investment broker are also targets of AI-generated calls. The financial industry uses voice-based authentication systems to access certain accounts and verify some banking transactions. AI-generated voices make these systems vulnerable to criminals, which can hugely impact financial companies and individual accounts. The amount of financial risk is enormous.

Law Enforcement

Even law enforcement agents are not safe from this technology. AI-generated voices can imitate officers and other officials, allowing criminals to breach security measures, confuse legal proceedings, and potentially change audio evidence. Many court proceedings rely on recorded evidence, so the potential for judicial interference is great.

The Media

Misinformation and fake news are already huge cultural problems. Voice cloning makes them more difficult to fight. Robocalls using this technology can more easily spread misinformation by faking the voices of respected public figures. It also means that reporters will have more difficulty establishing their credibility. Audio recordings used to be the gold standard of authentication. That standard is changing, leaving media figures open to questions about their accuracy and intentions.

Ongoing Efforts to Thwart Robocalls

AI-generated voice robocalls are a true threat to many industries, call centers among them. Members of the general public already mistrust unknown calls and usually will not answer them. They sometimes answer spoofed calls, however, believing they come from local companies or residents. The combination of spoofing and voice cloning will help defraud more people, further eroding consumers’ trust in the call center industry.

Do not wait for the states and the FCC to finally end the scourge of robocalls; take action now. Work to protect your company’s reputation through increased agent training, partnerships with trusted providers, call monitoring, and enhanced caller ID. Companies like Caller ID Reputation have the knowledge and technology to help you thrive in this fraud-filled environment. Use every resource available to maintain consumer trust in your company.