This new tool could stop the spread of deepfakes
Amber Authenticate could be a way to curb the sinister spread of fake news
You’ve probably heard of deepfakes by now: videos created with the help of AI that combine elements of two or more videos. The technique has multiple uses – changing what someone is saying, putting someone’s head on someone else’s body, that kind of thing.
One such example was a Steve Buscemi/Jennifer Lawrence mash-up – truly nightmare-inducing stuff.
But beyond the ‘creepy but basically harmless’, there are more sinister uses for deepfakes. Being able to manipulate anyone into saying almost anything is an obvious concern in an age of fake news, and feminists have pointed out that the videos could be used to “harass and humiliate” women.
This has, in fact, already happened – when deepfakes first came into the public eye, videos were already circulating on the internet with celebrity women’s heads placed on the bodies of porn stars. Finding a way to combat – or at least verify – such videos is therefore a pretty important task.
Now a new tool, Amber Authenticate, hopes to do just that.
The tool, WIRED explains, uses the blockchain to record ‘hashes’ – “cryptographically scrambled representations of the data”. If you’re worried that a video has been manipulated, you simply run it through the algorithm again and compare the result with the recorded hash; if the two hashes are different, the audio or video data of the file has been changed, warning you that it may have been tampered with.
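In other words, the core check is just a hash comparison. Here’s a minimal sketch in Python of what that verification step might look like – assuming SHA-256 as the hash function (the article doesn’t say which one Amber actually uses) and assuming the original hash has already been fetched from the blockchain record. The function names and file name are illustrative, not Amber’s real API:

```python
import hashlib

def file_hash(path: str) -> str:
    """Compute the SHA-256 hash of a file, reading it in chunks."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    return sha.hexdigest()

def verify(path: str, recorded_hash: str) -> bool:
    """Re-hash the file and compare it with the hash stored when the
    footage was first recorded; a mismatch means the bytes have changed."""
    return file_hash(path) == recorded_hash

# Hypothetical usage: recorded_hash would come from the blockchain record.
if verify("bodycam_footage.mp4", recorded_hash="9f86d08..."):
    print("Hashes match – no sign of tampering.")
else:
    print("Hash mismatch – the file may have been altered.")
```

The property doing the work here is that changing even a single byte of the file produces a completely different hash, so any edit to the audio or video stream shows up as a mismatch.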
“Technologists are going to have to validate the security of Amber as with any authentication technique,” Jay Stanley, senior policy analyst at the American Civil Liberties Union, told WIRED.
“But I hope that Amber or a similar product becomes standard. Like body cameras themselves, video authentication can help create community confidence in evidence about what’s taken place, and can give everybody confidence that things are on the up and up in what can be very harrowing and difficult incidents.”