You can’t libel the dead. But that doesn’t mean you should deepfake them.



Zelda Williams, daughter of the late actor Robin Williams, has a poignant message for her father’s fans.

“Please, just stop sending me AI videos of Dad. Stop believing I wanna see it or that I’ll understand. I don’t and I won’t,” she wrote in a post on her Instagram story on Monday. “If you’ve got any decency, just stop doing this to him and to me, to everyone even, full stop. It’s dumb, it’s a waste of time and energy, and believe me, it’s NOT what he’d want.”

It’s probably not a coincidence that Williams was moved to post this just days after the release of OpenAI’s Sora 2 video model and Sora social app, which gives users the power to generate highly realistic deepfakes of themselves, their friends, and certain cartoon characters.

That also includes dead people, who are seemingly fair game because it is not illegal to libel the deceased, according to the Student Press Law Center.

Sora will not let you generate videos of living people unless the video is of yourself or of a friend who has given you permission to use their likeness (or “cameo,” as OpenAI calls it). But these limits don’t apply to the dead, who can mostly be generated without roadblocks. The app, which is still only available via invite, has been flooded with videos of historical figures like Martin Luther King, Jr., Franklin Delano Roosevelt, and Richard Nixon, as well as deceased celebrities like Bob Ross, John Lennon, Alex Trebek, and yes, Robin Williams.

How OpenAI draws the line on generating videos of the dead is unclear. Sora 2 won’t, for example, generate former President Jimmy Carter, who died in 2024, or Michael Jackson, who died in 2009, though it did create videos with the likeness of Robin Williams, who died in 2014, according to TechCrunch’s tests. And while OpenAI’s cameo feature allows people to set instructions for how they appear in videos others generate of them — guardrails that came in response to early criticism of Sora — the deceased have no such say. I’ll bet Richard Nixon would be rolling over in his grave if he could see the deepfake I made of him advocating for police abolition.

Deepfakes of Richard Nixon, John Lennon, Martin Luther King, Jr., and Robin Williams. Image Credits: Sora, screenshots by TechCrunch

OpenAI did not respond to TechCrunch’s request for comment on the permissibility of deepfaking dead people. However, it’s possible that deepfaking dead celebrities like Williams is within the firm’s acceptable practices; legal precedent shows that the company likely wouldn’t be held liable for the defamation of the deceased.


“To watch the legacies of real people be condensed down to ‘this vaguely looks and sounds like them so that’s enough,’ just so other people can churn out horrible TikTok slop puppeteering them is maddening,” Williams wrote.

OpenAI’s critics accuse the company of taking a fast-and-loose approach to such issues, which is why Sora was quickly flooded with AI clips of copyrighted characters like Peter Griffin and Pikachu upon its release. CEO Sam Altman originally said that Hollywood studios and agencies would need to explicitly opt out if they didn’t want their IP included in Sora-generated videos, though he has since said the company will reverse that position. The Motion Picture Association has already called on OpenAI to take action on this issue, declaring in a statement that “well-established copyright law safeguards the rights of creators and applies here.”

Sora is, perhaps, the most dangerous deepfake-capable AI model accessible to people so far, given how realistic its outputs are. Other companies’ models, like xAI’s, lag behind in realism but have even fewer guardrails than Sora, making it possible to generate pornographic deepfakes of real people. As other companies catch up to OpenAI, we will set a horrifying precedent if we treat real people — living or dead — like our own personal playthings.



