Reddit is home to all sorts of interesting folks. One in particular, as Motherboard discovered, passes his days swapping celebrities’ faces onto porn actors’ bodies. I guess everyone needs a hobby. The redditor, who goes by “deepfakes,” has an impressive (if you can call it that) portfolio of convincing swaps, including the likes of Gal Gadot, Maisie Williams, and Taylor Swift. Look what you made her do, though! Apparently, the dude (I’m betting it’s a man) uses a machine-learning algorithm to bring his swaps to life.
Deepfakes is riding the wave of a booming trend: AI-assisted fake porn. Advances in machine learning mean there’s a lot of convincing fake celebrity porn out there, and apparently there’s an audience for it. Deepfakes created a subreddit dedicated to his hobby, and it has over 15,000 subscribers. On the platform, “deepfake” has become synonymous with the fake videos themselves.
Another redditor named “deepfakeapp” created FakeApp, an easy-to-use application that allows users to create fake porn with their own datasets. Deepfakeapp told Motherboard, “I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks.” He also said, “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.” Umm… WHY!? What kind of Black Mirror shit is this?
Obviously, this trend has alarming implications. Celebrities’ social media accounts have become hunting grounds for content that can then be used in fake porn. (Mind you, this could just as easily happen to a normal person, too.) And while most of the posts in r/deepfakes are porn, the same technology can be used in any type of video. Humanity was definitely not prepared for this.