Artificial intelligence experts release an unsettling deepfake video of Richard Nixon 'announcing the failure of the 1969 Apollo 11 Moon landing' to highlight the dangers of manipulated footage
- MIT created the deepfake video to demonstrate the dangers the technology poses
- The contingency speech prepared for Nixon was read by an actor and mapped onto archival footage
- AI technology altered the audio and video to make them look and sound convincing
A realistic and terrifying fake video shows what it would have looked like if President Richard Nixon had been forced to deliver a grim speech to the world had the Apollo 11 mission ended in disaster.
It is well known that the American President had two speeches prepared, one in the event of a safe landing and the other in the event of a tragedy.
Fortunately, the landing on July 20, 1969 by Neil Armstrong and Buzz Aldrin was a resounding success, making the latter redundant.
However, experts from the Massachusetts Institute of Technology (MIT) have created a completely artificial video that shows how that speech might have looked and sounded.
It is part of a project called ‘In Event of Moon Disaster’ and is designed to draw attention to the risks that deepfakes pose and how they can be used to manipulate people and spread fake news.
A terrifying and realistic deepfake video shows what it would have looked like if President Richard Nixon had been forced to deliver a grim speech to the world had the Apollo 11 mission ended in disaster (pictured, the deepfake video as Nixon begins his speech)
Deepfakes are fake videos that look realistic and have already created problems in the real world.
Last year, a viral video of Nancy Pelosi slurring her words and appearing drunk circulated on social networks. However, it had been manipulated and was not a legitimate representation of a speech she gave.
While this was not technically a deepfake, it did show the power of fake online videos.
Other deepfakes have emerged since Pelosi’s infamous video, including one of Mark Zuckerberg discussing privacy and various ones targeting Donald Trump.
As a result, experts have been debating how to curb the spread of misleading videos that have the potential to ruin reputations and endanger the public.
The MIT campaign used the 51st anniversary of the moon landing to rekindle the conversation.
The video comes with a clear disclaimer that says “what you are about to see is not real.”
Lasting over seven minutes, it is a well-edited film that combines deep learning, acting, and genuine NASA imagery.
The video comes with a clear disclaimer that says “what you are about to see is not real.” It lasts over seven minutes and is a well-edited film that combines deep learning, acting, and genuine NASA imagery (pictured, a shot of genuine NASA footage from the Apollo 11 mission)
At the end of the fake broadcast, the video shows another warning that the video is not legitimate. The titles on the screen explain how the video was created and end with ‘check your sources’
After appearing to show the Apollo 11 mission failing catastrophically, the broadcast cuts to Richard Nixon, who begins his speech.
If this had happened in real life, it would have been one of the saddest broadcasts of all time.
The first line is: “Fate has ordained that the men who went to the moon to explore in peace will stay on the moon to rest in peace.”
The full speech is available from the US National Archives and was read aloud by an actor.
The President’s facial movements and voice were made convincing through AI-powered ‘deep learning’ technology.
It took an entire team to produce the video, and the seven-minute clip took several months to create.
The results from the team of experts at MIT are sensational and look extremely realistic, unlike some deepfakes that have emerged recently.
The video was first shown in a physical art installation at MIT late last year and is now publicly available on YouTube.
At the end of the fake broadcast, the video shows another warning that the video is not legitimate.
The titles on the screen say: ‘This project shows the dangers of disinformation.
‘Deepfakes use artificial intelligence to make it appear that people have done and said things they never did, usually without their consent.
‘The project used a variety of techniques, from video dialogue replacement and voice conversion systems to more traditional video editing methods, to demonstrate the range of misinformation that is possible.
‘Check your sources.’