Deepfake Videos of Eerie Tom Cruise Revive Debate

To those fearful of a future in which videos of real people are indistinguishable from computer-generated forgeries, two recent developments that attracted an audience of millions might have seemed alarming.

First, a visual effects artist worked with a Tom Cruise impersonator to create startlingly accurate videos imitating the actor. The videos, created with the help of machine-learning techniques and known as deepfakes, gained millions of views on TikTok, Twitter and other social networks in late February.

Then, days later, MyHeritage, a genealogy and DNA-testing website, offered a tool to digitally animate old photographs of loved ones, creating a short, looping video in which people can be seen moving their heads and even smiling. More than 26 million images had been animated using the tool, called Deep Nostalgia, as of Monday, the company said.

The videos renewed attention to the potential of synthetic media, which could lead to significant improvements in the advertising and entertainment industries. But the technology could also be used — and has been — to raise doubts about legitimate videos and to insert people, including children, into pornographic images.

The creators of the viral Tom Cruise TikToks said the expertise required to use the technology makes abusing it much harder, and the company behind the photo-animating tool said it put in place safeguards to prevent misuse. Experts say the two examples are not overly alarming — but that they raise questions about the future of the technology that should be considered while it is still in its relative infancy.

“Although Deep Nostalgia itself is innocuous, it’s part of this set of tools that are potentially very threatening,” said Sam Gregory, the program director of Witness, a nonprofit organization focused on the ethical use of video, and an expert on artificial intelligence.

The digital imitation of Mr. Cruise was no easy feat. Chris Ume, the visual effects artist in Belgium who created the videos, said in an interview that they required extensive expertise and time.

Most of what you see in the videos is the body and voice of Miles Fisher, a Tom Cruise impersonator who was already fluent in the actor’s mannerisms and speech and who bears a strong resemblance even without the manipulations. Only the face, from the forehead to the chin, is a digital rendering of the real Mr. Cruise.

Mr. Ume spent two months training his computer model to create Mr. Cruise’s facial expressions, first feeding it video of random faces before focusing on Mr. Cruise. He then spent about 24 hours in production for each minute-long video, fine-tuning details like eye alignment.
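The approach Mr. Ume describes, pretraining a model on many faces before specializing it on a single target, is a common transfer-learning pattern. The sketch below is a minimal, hypothetical illustration of that two-stage idea in PyTorch, not his actual pipeline; the tiny network, the placeholder data and the hyperparameters are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class FaceAutoencoder(nn.Module):
    """A deliberately tiny autoencoder over 64x64 RGB face crops, for illustration only."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train(model, batches, epochs, lr):
    # Minimize pixel-wise reconstruction loss over batches of face crops.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for faces in batches:  # each item: an (N, 3, 64, 64) tensor of face crops
            opt.zero_grad()
            loss = loss_fn(model(faces), faces)
            loss.backward()
            opt.step()

if __name__ == "__main__":
    model = FaceAutoencoder()
    # Stage 1: pretrain on a large pool of random faces (random tensors stand in for real data).
    generic_faces = [torch.rand(8, 3, 64, 64) for _ in range(10)]
    train(model, generic_faces, epochs=1, lr=1e-3)
    # Stage 2: fine-tune on crops of the single target face, at a lower learning rate.
    target_faces = [torch.rand(8, 3, 64, 64) for _ in range(10)]
    train(model, target_faces, epochs=1, lr=1e-4)
```

Production face-swap systems typically add face detection, alignment, per-identity decoders and careful blending on top of this basic idea, which is part of why the work still demands months of effort and manual fine-tuning.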

Even if the technology improves, videos like his would require extensive manual work and a skilled impersonator, he said.

“It’s like a small Hollywood studio with the two of us,” he said. “It’s not something you can do at a home computer, pressing a button.”

The Deep Nostalgia tool was created for MyHeritage by D-ID, an artificial intelligence company based in Tel Aviv. Gil Perry, the chief executive of D-ID, said that the company works only with partners it can trust not to abuse the technology, and that it had a four-year relationship with MyHeritage.

Videos created using the tool have watermarks to indicate that they aren’t real, and the videos do not include audio, a decision that Mr. Perry said makes it harder to use them for unsavory purposes.

He said the technology that powered Deep Nostalgia was “just the tip of the iceberg of what we’re capable of doing.”

“The potential for the good part of this technology is endless,” he said.

When optimists talk about the possible strengths of the technology, they often point to its uses in advocacy, where it can put a face to issues and create deeper emotional connections.

A nongovernmental organization created a video of Javier Arturo Valdez Cárdenas, a Mexican journalist who was murdered in 2017, in which he appeared to call for justice for his own murder. The parents of Joaquin Oliver, a 17-year-old who was murdered in the mass shooting at a high school in Parkland, Fla., in 2018, digitally resurrected him for a video promoting gun safety legislation. The police in the Australian state of Victoria used the technology to have an officer who died by suicide in 2012 deliver a message about mental health support.

And “Welcome to Chechnya,” a documentary released last year about the purges of gay and lesbian people in Chechnya, used the technology to shield the identities of at-risk Chechens.

The effects could also be used in Hollywood to better age or de-age actors, or to improve the dubbing of films and TV shows in different languages, closely aligning lip movements with the language onscreen. Executives of international companies could also be made to look more natural when addressing employees who speak different languages.

But critics fear the technology will be further abused as it improves, particularly to create pornography that places the face of one person on someone else’s body.

Nina Schick, the author of “Deepfakes: The Coming Infocalypse,” said the earliest deepfaked pornography required hours of source video to produce, which is why celebrities, with ample footage publicly available, were the typical targets. But as the technology becomes more advanced, less source material will be needed to create the videos, putting more women and children at risk.

A tool on the messaging app Telegram that allowed users to create simulated nude images from a single uploaded photo has already been used hundreds of thousands of times, according to BuzzFeed News.

“This will become an issue that can affect everyone, especially those who don’t have resources to protect themselves,” Ms. Schick said.

The technology could also have a destabilizing effect on global affairs, as politicians claim that videos, including genuine ones, are fake in order to gain an advantage that the law professors Robert Chesney and Danielle Citron have called the “liar’s dividend.”

In Gabon, opposition leaders argued that a video of President Ali Bongo Ondimba giving a New Year’s address in 2019 was faked in an attempt to cover up health problems. Last year, a Republican candidate for a House seat in the St. Louis area claimed that the video of George Floyd’s death in police custody had been digitally staged.

As the technology advances, it will be used more broadly, according to Mr. Gregory, the artificial intelligence expert, but its effects are already pronounced.

“People are always trying to think about the perfect deepfake when that isn’t necessary for the harmful or beneficial uses,” he said.

In introducing the Deep Nostalgia tool, MyHeritage addressed the issue of consent, asking users to “please use this feature on your own historical photos and not on photos featuring living people without their permission.” Mr. Ume, who created the deepfakes of Mr. Cruise, said he had no contact with the actor or his representatives.

Of course, people who have died can’t consent to being featured in videos. And that matters if dead people — especially celebrities — can be digitally resurrected, as the artist Bob Ross was to sell Mountain Dew, or as Robert Kardashian was last year in a gift to his daughter Kim Kardashian West from her husband, Kanye West.

Henry Ajder, a deepfakes researcher, imagined a future in which our own voices could be used with assistants like Amazon’s Alexa, allowing us to stay connected with loved ones after our deaths. Or, as posited in an episode of “Black Mirror,” whole aspects of our personalities could be simulated after death, trained on our voices and activity on social media.

But that raises a tricky question, he said: “In what cases do we need consent of the deceased to resurrect them?”

“These questions make you feel uncomfortable, something feels a bit wrong or unsettling, but it’s difficult to know if that’s just because it’s new or if it hints at a deeper intuition about something problematic,” Mr. Ajder said.