SAG-AFTRA President Fran Drescher slams ‘AI fraudsters’ as congressional bill on deepfakes receives broad support

A new bill on artificial intelligence deepfakes introduced by a bipartisan group of senators is bringing together actors, studios, and tech companies.

The No Fakes Act, led by Democratic Senator Chris Coons of Delaware, is a revised version of a discussion draft introduced last fall that takes aim at digital deepfakes and protects the likenesses of actors (and average citizens).

“Game over A.I. fraudsters! Enshrining protections against unauthorized digital replicas as a federal intellectual property right will keep us all protected in this brave new world,” SAG-AFTRA President Fran Drescher said in a statement on the union’s website. “Especially for performers whose livelihoods depend on their likeness and brand, this step forward is a huge win!”

She thanked Senator Coons, as well as other backers of the bill including Sens. Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC).


SAG-AFTRA National Executive Director and Chief Negotiator Duncan Crabtree-Ireland told Fox News Digital, “I think that it was always the vision of Senator Coons, for example, and certainly it was our vision that all of the major stakeholders should be consulted in the process before the bill was formally introduced, because it’s so difficult to get legislation moved in Washington, especially right now. And we felt like if all of the concerns and issues could be really heard, then we’d have the best chance for getting something enacted. And from our point of view, this is absolutely crucial. The timing is now, and it’s desperately needed.”

The Motion Picture Association, which represents major studios including Netflix, Sony, Paramount, Universal, Disney, and Warner Bros., also praised the bill.

“We support protecting performers from generative AI abuse – and this bill thoughtfully establishes federal protections against harmful uses of digital replicas, while respecting First Amendment rights and creative freedoms,” MPA Chair and CEO Charles Rivkin said in a statement on the organization’s website.


The MPA had initially been hesitant about the wording of the original draft, saying in a statement last year, when it was introduced, that it looked forward to working with the senators on the bill “without infringing on the First Amendment rights and creative freedoms upon which our industry depends.”

Crabtree-Ireland said, “I think getting the MPA, the RIAA had been on board from early days. … I think it was a combination of all of those factors that have gotten us to what I see as an unprecedented level of support for any legislation that impacts the entertainment industry.”

AI expert Marva Bailer noted that tech companies like OpenAI and IBM also have a stake in backing the bill.

WATCH: AI EXPERT EXPLAINS WHY TECH COMPANIES ARE ON BOARD WITH AI LEGISLATION BACKED BY HOLLYWOOD


“What might surprise some people is that the technology companies, alongside the motion picture organizations, professional associations and creators, are actually for this bill,” she told Fox News Digital. “So, why would an OpenAI or Disney or an IBM Alliance WatsonX, why would they be interested? Well, it’s because it’s going to put some guardrails around the established market. And what’s happening with these deepfakes is people are creating a substitute market. And this substitute market has no rules and no monetization.”

Coons’ website summarizes the bill, explaining it would “hold individuals or companies liable for damages for producing, hosting, or sharing a digital replica of an individual performing in an audiovisual work, image, or sound recording that the individual never actually appeared in or otherwise approved – including digital replicas created by generative artificial intelligence (AI).”


The proposed penalties include a $5,000 fine plus damages and removal of the digital replica. Civil action can also be brought against a perpetrator: an online service can be liable for $5,000 per work containing the unauthorized replica, while a non-online perpetrator, such as a studio, can be liable for $25,000.

“I think the most important thing is to actually get something enacted, because this is a problem that’s affecting people right now, and it’s really very real,” Crabtree-Ireland said. “And I’ve talked to dozens of our members who’ve been personally impacted by it. I’ve been personally impacted by it. I myself was deepfaked last year during our contract ratification process at the end of the TV/theatrical strike. Someone made a video of me saying false things about the contract, urging people to vote against the contract that I had negotiated. They put this out on social media like Instagram, and tens of thousands of people saw it, and there was no way to un-ring that bell. So I think there is a need that certainly our members face, and also people far beyond.”

WATCH: LEGAL EXPERT EXPLAINS WHAT CHANGED IN NO FAKES ACT TO BRING STUDIOS ON BOARD

Rosenberg, a legal expert, cited the recent case of a Maryland school principal allegedly framed by an AI deepfake for making racist comments, highlighting that the bill is “not just limited to celebrities.”

“So you don’t necessarily have to establish that somebody has a commercial value in their voice or a commercial value in their likeness in order to be covered by this act,” he said.


He added, “The stories that we hear about with deepfakes are certainly the bad stories and the stories that are affecting individuals. But I do agree that there is a lot that is good. And by having these safe harbor type provisions, it allows the technology to continue to develop and grow.” 

Bailer felt similarly, saying the importance of the bill lies in establishing “transparency.”

“So no one’s saying, ‘Oh, AI’s not going to happen. We have to stop it.’ They want to understand the transparency of where AI is being used, and they want the permissions. And what we really need to be watching out for is the substitute marketplace. And what that means is we’re seeing these actual brands where they are licensing through contracts, their image and likeness to have the Elvis experience and the Kiss experience and the ABBA experience. And it’s very exciting.” 

While guardrails have been put in place for writers and actors after last year’s strikes, SAG-AFTRA is still contending with the impact of AI across other forms of entertainment.


After more than a year and a half of negotiations, the union is currently on strike on behalf of members who perform in video games.

“Although agreements have been reached on many issues important to SAG-AFTRA members, the employers refuse to plainly affirm, in clear and enforceable language, that they will protect all performers covered by this contract in their A.I. language,” SAG-AFTRA’s website states.

WATCH: SAG-AFTRA REP ON WHY THE ‘DEVASTATING’ HOLLYWOOD STRIKES LAST YEAR WERE ‘NECESSARY’

Regarding the current strike, Crabtree-Ireland said he hopes the No Fakes bill, if passed, will add to what he calls a “mosaic of protection.”

Reflecting on the past strike, which shut down Hollywood for almost six months last year, Crabtree-Ireland said, “Our members suffered. Other workers in the industry suffered. The industry suffered. It was necessary at that time. I wish it hadn’t been. I mean, to me, when I look at the ultimate agreement, I feel like the companies could have made this deal with us on July 12th, and this entire thing could have been avoided, and yet they refused. And so that’s very frustrating. On the other hand, it was essential that we be out ahead of the implementation of AI. If we were trying to negotiate this after the industry had already started using it in a big way, it would be impossible to actually sort of put that genie back in the bottle. And so I feel really good that we successfully anticipated this challenge.”

He added, “It’s an existential battle, and that’s why we’re fighting it right now with the video game companies. Because if we wait three years, it will be too late. This will have gone too far, and we won’t be able to roll it back. So, this is a fight for the future of our members’ careers and even more fundamental than that.”