Senators introduce bill to protect individuals against AI-generated deepfakes


A group of senators introduced a bill today that would make it illegal to create digital recreations of a person’s voice or likeness without that person’s consent. The bipartisan effort by Senators Chris Coons (D-Del.), Marsha Blackburn (R-Tenn.), Amy Klobuchar (D-Minn.) and Thom Tillis (R-N.C.) is called the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act of 2024.

If it passes, the NO FAKES Act would allow people to claim damages when their voice, face or body is recreated by artificial intelligence without permission. Both individuals and companies would be held liable for producing, posting or sharing unauthorized digital replicas, including those made by generative AI.

We have already seen plenty of celebrities impersonated by AI: deepfakes have tricked people with giveaways of fake Le Creuset cookware, and an eerily familiar voice appeared in a ChatGPT voice demo. AI has also been used to make political candidates appear to say things they never said, as in the most recent example. And it isn’t just celebrities who are at risk.

“Everyone deserves the right to own and protect their own voice and likeness, whether you’re Taylor Swift or anyone else,” Senator Coons said. “Generative AI can be used as a tool to foster creativity, but that can’t come at the expense of the unauthorized exploitation of anyone’s voice or likeness.”

The pace of new legislation lags behind the pace of new technological development, so it’s encouraging to see lawmakers taking AI regulation seriously. The bill proposed today follows the DEFIANCE Act, recently passed by the Senate, which allows victims of sexually explicit deepfakes to sue for damages.

Several entertainment organizations have pledged their support for the NO FAKES Act, including SAG-AFTRA, the RIAA, the Motion Picture Association and the Recording Academy. Many of these groups are taking their own actions to protect against unauthorized use of AI. SAG-AFTRA recently went on strike in an effort to secure a union agreement covering likenesses in video games.

Even OpenAI is among the bill’s supporters. “OpenAI is pleased to support the NO FAKES Act, which protects creators and artists from unauthorized digital copies of their voice and likeness,” said Anna Makanju, vice president of global affairs at OpenAI. “Creators and artists must be protected from improper impersonation, and thoughtful legislation at the federal level can make a difference.”
