The FCC wants to make robocalls that use AI-generated voices illegal


The rise of AI-generated voices impersonating celebrities and politicians could make the Federal Communications Commission's (FCC) job of fighting robocalls and preventing people from being spammed and scammed even more difficult. That's why FCC Chairwoman Jessica Rosenworcel wants the commission to formally recognize calls that use AI-generated voices as artificial voices under existing law, which would make it illegal to use voice-cloning technologies in robocalls. Under the Telephone Consumer Protection Act (TCPA), which the FCC enforces, residential solicitations that use an artificial voice or a recording are against the law. As TechCrunch notes, the FCC's proposal would make it easier to go after and charge the bad actors behind these calls.

“AI-generated voice cloning and images are already confusing consumers into thinking fraud and scams are legitimate,” said Rosenworcel. “No matter which celebrity or politician you prefer, or how you feel about your relatives when they reach out for help, we can all be targeted by these fake calls.” If the FCC deems AI-generated voice calls illegal under existing law, the agency could give state attorneys general offices across the country “new tools they can use to fight fraud and protect consumers.”

The FCC’s proposal comes shortly after some New Hampshire residents received calls from what sounded like President Joe Biden telling them not to vote in their state’s primary. A security firm that analyzed the call determined it was generated using artificial intelligence tools made by the voice-cloning startup ElevenLabs. The company reportedly banned the account responsible for the message impersonating the president, but the incident may be just one of many attempts to disrupt the upcoming US election using AI-generated content.


