OpenAI pauses MLK deepfakes on Sora after ‘disrespectful’ videos.


OpenAI said Thursday night that it has "paused" Martin Luther King Jr. deepfakes on its social app Sora after users created "disrespectful" AI-generated videos of the late civil rights leader. The company said representatives or estates of other historical figures will now be able to opt out of having their likenesses used on the platform.

The company said it made the decision after complaints from King's estate and his daughter Bernice King, who asked people on social media to stop sending her AI videos of her father. King is one of many deceased celebrities and historical figures whose likenesses have appeared in Sora, often in crude, offensive, and distasteful ways.

At the request of King, Inc., OpenAI has paused generations depicting Dr. King while it works to strengthen guardrails around historical figures.

While OpenAI acknowledges there are strong free-speech interests in depicting historical figures, the company believes public figures and their families should ultimately have control over how their likenesses are used. Authorized representatives or estate owners can request that a figure's likeness not be used in Sora cameos.

OpenAI's shifting stance on historical figures mirrors its approach to copyright when Sora first launched. That strategy proved controversial, and the platform awkwardly pivoted to an "opt-in" policy for rights holders after being flooded with videos of characters such as Pikachu, Rick and Morty, and SpongeBob SquarePants.

Unlike copyright, there is no federal framework protecting a person's likeness, but a variety of state laws allow people to sue over the unauthorized use of a living person's likeness, and in some states a deceased person's likeness as well. California, where OpenAI is based, has a law that specifically applies postmortem rights to digital replicas of performers. As for living people, OpenAI has made appearing in Sora videos opt-in from the beginning, requiring users to create AI cameos of themselves.


