US Senators Propose NO FAKES Act to Regulate Unauthorized AI Replicas
Summary:
A bipartisan group of U.S. senators has proposed the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, which aims to ban unauthorized AI-generated reproductions of people’s voices and images. If passed, the Act would penalize companies or individuals that create such replicas, with liability also extending to platforms that knowingly host these illegitimate AI duplicates. Certain uses protected by the First Amendment would remain permissible. The legislation arrives amid a surge of AI-assisted creations in the creative industry and has sparked debate over intellectual property rights.
Proposed legislation from a bipartisan group of senators seeks to outlaw unlicensed AI-generated recreations of people’s voices and appearances. Democratic Senators Chris Coons and Amy Klobuchar and Republican Senators Marsha Blackburn and Thom Tillis unveiled the discussion draft, titled the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, on October 11. The Act would impose penalties on companies or individuals who create unauthorized AI duplicates of any individual, living or deceased, as well as on platforms that knowingly host such unauthorized AI clones. Penalties are proposed to begin at $5,000 per violation.
The NO FAKES Act permits certain uses of unauthorized AI replicas that are safeguarded by the First Amendment, including replicas used for news reporting, documentaries, commentary, critique, scholarship, humor, and parody.
Senator Coons voiced the growing call from creators nationwide for Congress to formulate clear rules governing generative AI. He asserted that Congress should strike the right balance between protecting individual rights and upholding the First Amendment while also fostering AI innovation and creativity.
Senator Blackburn noted that this bill is a positive initial measure to safeguard songwriters, actors, and all U.S. creatives, stating that these individuals deserve to rightfully own their name, image, and likeness (NIL).
The draft legislation arrives amid a rapid rise in AI-assisted songs, with hundreds of AI-generated imitations of artists streaming on platforms such as YouTube and SoundCloud.
One such example is “Heart on my sleeve,” a track by an anonymous TikTok user dubbed “ghostwriter977” that features AI-produced vocals imitating artists Drake and The Weeknd. The track gained immense popularity earlier this year, amassing millions of views before it was taken down from the platform. Several paid services currently offer AI technology that mimics the voices of musicians, actors, and public figures.
AI-produced likenesses have also sparked controversy in Hollywood, fueling strikes and disagreement within the actor community. The Screen Actors Guild‐American Federation of Television and Radio Artists (SAG-AFTRA) union has endorsed the proposal.
In contrast, the Alliance of Motion Picture and Television Producers (AMPTP) allegedly declined to “safeguard actors from being replaced by AI,” among other sticking points, causing a breakdown in its talks with SAG-AFTRA on Oct. 11. However, the Writers Guild of America (WGA) and the AMPTP struck a deal on Sept. 27, ending a nearly five-month strike. That agreement governs AI usage in writers’ rooms and also provides for increased wages and more balanced contracts.
Published At
10/13/2023 1:20:30 AM
Disclaimer: Algoine does not endorse any content or product on this page. Readers should conduct their own research before taking any actions related to the asset, company, or any information in this article and assume full responsibility for their decisions. This article should not be considered as investment advice. Our news is prepared with AI support.