
The views and opinions expressed in this piece are those solely of the author, and do not necessarily reflect the position of Highsnobiety as a whole.

There was a time in the not-too-distant past when a person had to use something like the Sears catalog as inspiration to masturbate with; sifting past the power tools, and stopping before the pajama section, until the lacy frills of the lingerie selection appeared. Shows like Netflix’s Big Mouth – which hilariously captured the essence of puberty – have made it clear that anything from the shape of a tomato to the hint of cleavage on the Land O’Lakes cartoon logo can be harnessed for sexual purposes if the rubber/tugger has a vibrant enough imagination.

Even as the Internet took hold, the terrible dial-up speeds of early modems were hardly capable of painting a complete picture of the explicit photos and videos that people wanted to see. But then speeds increased – expediting not only playback, but also the finish – and people haven’t looked back since.

Now, possibly more than ever, porn has become as socially acceptable as marijuana use. It may not be for everyone, but for those who want to indulge, a combination of science, technology, and ingenuity has created different types to satiate users’ desires.

What are Deepfakes?

this is NOT Daisy Ridley

However, the emergence of a new type of porn – Deepfake videos – suggests that we may finally have gone too far, creating a hybrid of Black Mirror, Hollywood, and IRL crushes.

In a world already buzzing with “fake news,” Deepfake videos are AI-generated clips conceived by a Reddit user, deepfake, who used open-source libraries (TensorFlow), Google image search, social media sites like Instagram, stock photos, and YouTube videos to build a machine-learning algorithm that let him insert famous people’s faces onto pre-existing videos frame by frame.
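The frame-by-frame process described above can be sketched in miniature. To be clear, the snippet below is not a working Deepfake – the real systems train deep neural networks on thousands of face images and use face detectors to find where to paste – but it illustrates the per-frame compositing loop the technique relies on. The function names, toy arrays, and fixed positions are all invented for illustration:

```python
import numpy as np

def paste_face(frame, face, top, left):
    """Composite a synthesized face patch onto a single video frame.

    In an actual Deepfake pipeline, `face` would be generated by a
    trained neural network (e.g. an autoencoder built with TensorFlow)
    and `top`/`left` would come from a face detector. Here both are
    supplied by hand to keep the sketch self-contained.
    """
    h, w = face.shape[:2]
    out = frame.copy()  # leave the original frame untouched
    out[top:top + h, left:left + w] = face
    return out

def fake_video(frames, face, positions):
    """Apply the swap to every frame in turn - the frame-by-frame
    insertion the article describes."""
    return [paste_face(f, face, t, l) for f, (t, l) in zip(frames, positions)]

# Toy demo: three 8x8 grayscale "frames" and a 2x2 white "face" patch
# that drifts across the frame, as a tracked face would.
frames = [np.zeros((8, 8), dtype=np.uint8) for _ in range(3)]
face = np.full((2, 2), 255, dtype=np.uint8)
positions = [(1, 1), (2, 2), (3, 3)]
result = fake_video(frames, face, positions)
```

In practice this naive paste is exactly why early Deepfakes glitch: without blending, color matching, and accurate landmark tracking, the seams around the inserted face flicker from frame to frame.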

While the less harmful versions involve President Donald Trump’s mug being inserted over Biff Tannen’s face in Back to the Future, Nic Cage being put into films he never appeared in, and the sort of digital trickery filmmakers used to work around Paul Walker’s death in Furious 7, many in the subreddit realized the technique could be used for pornographic purposes as well. Not surprisingly, as many of Hollywood’s leading ladies began being “Deepfaked,” subscribers swelled to 80,000 in a matter of months, and deepfake’s moniker became to video manipulation what Google is to search.

Although there are glitches, the videos are quite believable, and they are only getting more convincing as other people with programming backgrounds experiment with browser-based applications that best match a face to a body. deepfake himself created FakeApp for the technologically challenged who still wanted to participate. The result: Deepfake videos of actresses like Scarlett Johansson, Gal Gadot, Daisy Ridley, Jessica Alba, Maisie Williams, Taylor Swift, and Emma Watson. And, not surprisingly, these videos are marketed as “hacked” or “stolen” – an effective ruse, given that real nude photos of actresses once surfaced in a Reddit event called “The Fappening.”

If and, inevitably, when these Deepfake videos get another layer of technology-assisted fakery – like Adobe’s pilot projects VoCo and Cloak, the former a “Photoshop for audio” that can take 20 minutes of a person’s recorded speech and transform it into a brand-new conversation of the user’s construction – it will only further blur the line between real and fake.

How Deepfakes could impact you


Some may charge that the celebrity nude fake has been around for decades. And while that’s true, there’s something deeply concerning about how convincing these videos are. But celebrities may not ultimately be the ones impacted most.

In 2016, people uploaded 24 billion selfies to Google Photos. The average Facebook user has 338 friends, helping push the company to roughly 350 million photo uploads a day. Needless to say, most people have some kind of digital footprint – whether that’s a LinkedIn profile picture or vacation photos shared with “friends” on Facebook.

With Deepfake videos, a dangerous precedent is being set: anyone and everyone could have their likeness inserted into a pornographic scenario. For some, this might only be a way to live out a sexual fantasy that will never happen. For others, it could be used as an attack – like the revenge porn once hosted on shuttered sites like IsAnyoneUp? – in an attempt to damage the lives and careers of former lovers.

Prior to the emergence of this technology, 1 in every 25 Americans (10 million people) had already either been threatened with, or been the victim of, nonconsensual explicit image sharing.

That number swells to 1 in every 14 (7 percent) for young people ages 15-29, 1 in 10 for women under the age of 30, and tops out at 17 percent for LGB Americans who have either had an image shared without their consent or been threatened with it.

Currently, only 34 states and Washington, DC have laws expressly applicable to revenge porn. And since Deepfakes are such a new phenomenon, people are having a hard time determining what laws, if any, are being broken. Yes, the videos use a person’s likeness, but their pornographic elements feature performers who consented to being filmed.

Additionally, the videos themselves could fall into the murkier corners of the First Amendment, where parody, satire, and caricature are all protected and hard to define. Some have even speculated that Deepfake creators could hide behind the alibi that their work is an erotic “art project.”

As Wired noted, “You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.”

Major content hubs like Reddit, Twitter, and Pornhub have all condemned the practice, and the subreddit where it started has since been shuttered. But as of this writing, a Pornhub search still turned up 47 different Deepfake videos – some of which append “NOT” in front of a celebrity’s name as a quasi “out.”

While Hollywood publicists – whose job it is literally to curate a star’s image – will be committed enough to see that these videos are expunged, what about the regular citizens who will inevitably end up on a porn site? What are they supposed to do? Is it enough to simply tell a porn webmaster, “hey, that’s not me”? And if it isn’t, how does a person go about proving that a particular video has been faked?

We’re all fucked…quite literally.

In related news, read about PornHub’s decision to release their premium content for free on Valentine’s Day.

  • Featured/Main Image: Stephen Cheetham
Words by Alec Banks
Features Editor

Alec Banks is a Los Angeles-based long-form writer with over a decade of experience covering fashion, music, sports, and culture.
