
Listen to this “Biden” call to voters. No wonder the FCC is cracking down on AI robocalls.


Some voters in New Hampshire received a call that sounded a lot like President Joe Biden. The call encouraged New Hampshire residents to stay home during last week’s primary election to “save their votes” for the November general election.

Of course, this makes no sense. Voters can vote in both elections, so why would Biden tell them such a thing? Well, that's because he didn't. These were AI-generated robocalls made to sound like Biden. You can listen to one here, via The Telegraph:

This is just a real-life example of how artificial intelligence can be weaponized by bad actors, which may be a big reason why the FCC wants to take action against AI-generated calls now.

FCC proposes ban on artificial intelligence robocalls

FCC Chairwoman Jessica Rosenworcel issued a statement on Wednesday announcing her proposal that the FCC treat AI-generated calls as “artificial” voices under the Telephone Consumer Protection Act (TCPA). By doing so, the FCC would make AI-generated robocalls illegal.

The TCPA is often used by the FCC to restrict spam calls that consumers receive from telemarketers. Under the law, the use of artificial or prerecorded voice messages and automatic telephone dialing systems is prohibited.

“Artificial intelligence-generated voice clones and images are already causing confusion by deceiving consumers into thinking scams and fraud are legitimate,” Rosenworcel said in a statement. The statement continued:

“No matter which celebrity or politician you like, or what your relationship is with your relative when they ask for help, we all have the potential to be targeted by these fake calls. That’s why the FCC is taking steps to recognize this emerging technology as illegal under existing law, giving our partners at state attorneys general offices across the country new tools they can use to combat these scams and protect consumers.”

The timing of Rosenworcel’s statement suggests that the Biden robocall has raised concerns about how AI-generated voices could be used in telemarketing scams and, potentially, election fraud.

So far, the only real steps to prevent the worst-case scenarios of AI-generated speech are being taken by the AI companies themselves. Bloomberg reports that AI company ElevenLabs last week suspended the user who created the Biden robocall on its platform.

“We are committed to preventing the misuse of audio AI tools and take any incidents of misuse extremely seriously,” ElevenLabs said in a statement.

However, as we recently saw with the non-consensual AI-generated pornographic images of Taylor Swift, some in the industry may feel differently than ElevenLabs does about how their AI products are used.




