Faking Videos Is Easy, Deep Nostalgia Shows

Just click and animate people

By Sascha Brodsky, Senior Tech Reporter
Updated on March 5, 2021
Fact checked by Rich Scherr

Key Takeaways

Deep Nostalgia is a new program that lets you animate old photos.
The technology shows just how easy it is to create videos of people doing things they never actually did in real life.
Deepfake technology is already so sophisticated that it is hard to tell whether a video is real or computer-generated, one expert says.

imaginima / Getty Images

Experts warn users to be careful with new software that can create so-called deepfakes, in which videos of real people are simulated.

Deep Nostalgia, released by the company MyHeritage, is trending on social media, with users reanimating everyone from famous composers to dead relatives. The software is drawing mixed reactions, with some people delighted by the creations and others finding them creepy. It also shows just how easy it is to create videos of people doing things they never actually did in real life.

"Deepfake technology is getting more sophisticated and more dangerous," Aaron Lawson, assistant director of SRI International's Speech Technology and Research (STAR) Laboratory, said in an email interview. "This is partly due to the nature of artificial intelligence. Where 'traditional' technology requires human time and energy to improve, AI can learn from itself.

"But AI's ability to develop itself is a double-edged sword," Lawson continued. "If an AI is created to do something benevolent, great. But when an AI is designed for something malicious like deepfakes, the danger is unprecedented."

Software Brings Photos to Life

Genealogy website MyHeritage introduced the animation engine, known as Deep Nostalgia, last month. It lets users animate photos directly on the MyHeritage website. A company called D-ID designed the underlying algorithms, which digitally recreate the movement of human faces. According to MyHeritage, the software applies those movements to a photograph and modifies its facial expressions so they move the way human faces usually do.

Deep Nostalgia shows that deepfake technology is becoming more accessible, Lior Shamir, a professor of computer science at Kansas State University, said in an email interview. It is progressing quickly and erasing even the subtle differences between fake and real video and audio.

"There has also been substantial progress toward real-time deepfakes, meaning that convincing deepfake videos are generated at the time of video communication," Shamir said. "For instance, one can have a Zoom meeting with a certain person while seeing and hearing the voice of a completely different person."
There is also a growing number of language-based deepfakes, Jason Corso, director of the Stevens Institute for Artificial Intelligence at the Stevens Institute of Technology, said in an email interview. "Generating whole paragraphs of deep fake text toward a specific agenda is quite difficult, but modern advances in deep natural language processing are making it possible," he added.

How to Detect a Deepfake

While deepfake-detection technology is still in its infancy, there are a few ways you can spot one, Corso said, starting with the mouth.

"The variability in the appearance of the inside of the mouth when someone is speaking is very high, making it difficult to animate convincingly," Corso explained. "It can be done, but it is harder than the rest of the head. Notice how the Deep Nostalgia videos do not demonstrate an ability for the photograph to say 'I love you' or some other phrase during the deep fake creation. Doing so would require the opening and closing of the mouth, which is very difficult for deep fake generation."

Ghosting is another giveaway, Corso added. If you see blurring around the edges of the head, that is the result of "fast motion or limited pixels available in the source image. An ear could partially disappear momentarily, or hair could become blurry where you wouldn't expect it to," he said.

You also can look for color variation when trying to spot a deepfake video, such as a sharp line across the face, with darker tones on one side and lighter tones on the other.

"Computer algorithms can often detect these patterns of distortion," said Shamir. "But deepfake algorithms are advancing rapidly. It is inevitable that strict laws will be required to protect from deepfakes and the damage that they can easily cause."
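To make Shamir's point concrete, here is a minimal, heavily simplified sketch of how software might quantify two of the visual cues described above: a lighting seam across the face and unusually soft detail around the head. It is not MyHeritage's, D-ID's, or any researcher's actual detector; it only assumes the open source OpenCV and NumPy libraries, and the file name, padding, and interpretation of the numbers are illustrative assumptions.

```python
# Illustrative sketch only: measures two rough deepfake cues on a single frame.
import cv2
import numpy as np

def face_asymmetry_and_blur(image_path: str):
    # Load a single frame (for example, a screenshot grabbed from a suspect video).
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Off-the-shelf Haar cascade bundled with OpenCV, used here only to locate a face.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face found; nothing to check

    x, y, w, h = faces[0]
    face = gray[y:y + h, x:x + w]

    # Cue 1: a sharp lighting seam across the face. Compare average brightness of
    # the left and right halves; a large gap hints at the color variation
    # mentioned in the article. What counts as "large" is a judgment call.
    left, right = face[:, : w // 2], face[:, w // 2:]
    brightness_gap = abs(float(np.mean(left)) - float(np.mean(right)))

    # Cue 2: ghosting. Measure overall sharpness (variance of the Laplacian) in a
    # slightly padded box around the face; unusually low values can indicate
    # smeared ears or blurry hair at the edges of the head.
    pad = max(4, w // 10)
    y0, y1 = max(0, y - pad), min(gray.shape[0], y + h + pad)
    x0, x1 = max(0, x - pad), min(gray.shape[1], x + w + pad)
    sharpness = cv2.Laplacian(gray[y0:y1, x0:x1], cv2.CV_64F).var()

    return {"brightness_gap": brightness_gap, "sharpness": sharpness}

if __name__ == "__main__":
    # "frame.jpg" is a placeholder path, not a file referenced in this article.
    print(face_asymmetry_and_blur("frame.jpg"))
```

Real detection systems rely on far more sophisticated, learned features, but the idea is the same: turn the artifacts a careful viewer would otherwise have to spot by eye into numbers a program can flag.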