Face-swap porn is now a thing and it's truly terrifying
Don't remember shooting that sex tape you're now in? This is why
Face swaps are a good laugh, aren’t they? All those viral football videos of Anchorman scenes with players’ heads stuck on top like lollipops to make some banterous point about an upcoming game? Everyone loves those, don’t they? Hey, we’ve even been known to do one or two ourselves:
But have you ever stopped to consider the logical end-game of this sort of activity?
What if, with the increasing sophistication of Artificial Intelligence, the technology enabled you not just to stick a rudimentary head over someone else’s, but to fully swap out a face in a video for another person’s? And what if you could then also manipulate that face and dub over the audio?
Well, then you could make anyone say and do anything, simply by mapping their face over an actor’s and manipulating the audio. Forget today’s ‘fake news’, you could create something on an altogether different level.
Inevitably, one day this technology will exist, but it won’t be for a while, surely; after all, this stuff must be seriously difficult to do accurately.
Well, think on, because this video, released in July last year, demonstrated that it was possible to map face movements and speech from one video onto another - albeit with both videos featuring the same person and voice - in this case, former US president Barack Obama.
But now, according to a fascinating new investigation by Motherboard, the day of faceswap reckoning has already arrived and, inevitably, it’s started where the internet always starts: pornography.
In December, they found a redditor named ‘deepfakes’ who was already spending their time swapping celebrity faces onto the bodies of porn performers, using a machine learning algorithm and nothing more than their home computer.
However, in the short time since that article was published, “the practice of producing AI-assisted fake porn has exploded… and the results have become increasingly convincing”.
Fellow redditors have been busy refining techniques and making them available to those without a computer science background, providing apps and walk-through instructions which enable civilians to create AI-assisted fake porn.
A user named ‘deepfakeapp’ has, as the name suggests, created one of these apps and explained:
“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks. Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”
The videos also do not take an inordinate amount of time to create: according to Motherboard, “Running the entire process, from data extraction to frame-by-frame conversion of one face onto another, would take about eight to 12 hours if done correctly.”
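For the technically curious, here is a rough sketch of what that pipeline looks like in code. To be clear, this is our own illustration, not the redditors’ actual software: it uses OpenCV’s stock face detector for the extraction and conversion steps, and the `swap_face` function is a placeholder for the trained neural network - the training stage in between is where the real work, and most of those eight to 12 hours, goes.

```python
# Hypothetical sketch of the deepfake pipeline Motherboard describes:
# 1) extract face crops from source footage, 2) train a model on them
# (not shown - that's the slow part), 3) convert a target video frame
# by frame. swap_face() stands in for the trained model.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_faces(video_path, out_dir):
    """Step 1, 'data extraction': save a cropped face from each frame."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
            cv2.imwrite(f"{out_dir}/face_{idx:06d}.png",
                        frame[y:y + h, x:x + w])
            idx += 1
    cap.release()

def convert_video(video_path, out_path, swap_face):
    """Step 3, 'frame-by-frame conversion': replace each detected face
    with the model's output and re-encode the video."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(out_path,
                             cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, fw, fh) in detector.detectMultiScale(gray, 1.3, 5):
            # swap_face is the trained model: face crop in, new face out
            frame[y:y + fh, x:x + fw] = cv2.resize(
                swap_face(frame[y:y + fh, x:x + fw]), (fw, fh))
        writer.write(frame)
    cap.release()
    writer.release()
```

Even this toy version shows why the barrier to entry is so low: the detection and video plumbing are off-the-shelf, so all the craft sits in training that one swap model.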
You can see for yourself how sophisticated the results are, with these clips of the faces of Daisy Ridley, Jessica Alba and Gal Gadot superimposed on the bodies of porn stars:
Of course, while this is undoubtedly upsetting for the subjects of these videos, at the moment it is at least fairly easy to spot that they are not real, which gives those seemingly involved overwhelming plausible deniability. But the time cannot be far away when a ‘sex tape’ of a celebrity surfaces with enough believable detail in its setting and timing that a denial alone would not stop people accepting it as real.
And this phenomenon is not going to confine itself to porn. Given the effects of Trump’s persistent lying and the ability of social media to amplify fake media, it is surely only a matter of time before words - and actions - are literally put into the mouths of people in positions of power.
In fact, a video already exists which face-swaps the legendary internet obsession Downfall, combining Bruno Ganz’s performance with the face of Argentina’s president Mauricio Macri:
Imagine the moment when a power-hungry politician is offered a video of their rival in an apparently compromising situation - a video which could take down their opponent, or at the very least cast aspersions on their character that they would never be able to shake off. Would they really be able to resist?
As Deborah Johnson, Professor Emeritus of Applied Ethics at the University of Virginia’s school of engineering, says:
“You could argue that what’s new is the degree to which it can be done, or the believability, we’re getting to the point where we can’t distinguish what’s real - but then, we didn’t before,” she said. “What is new is the fact that it’s now available to everybody, or will be… It’s destabilizing. The whole business of trust and reliability is undermined by this stuff.”
No smoke without fire? Not for much longer, it seems.