
$50 DeepNude app undresses women with a single click

Deepfakes: the convincing images and videos created by sophisticated neural networks, an evolving technology that first came to light in 2017, threaten to undercut the veracity of everything. Any CEO or other public figure can be turned into an unwitting actor for the sake of fake news, undermining trust and reliability and, um…

Oh, who are we kidding? It’s all about boobs.

Motherboard reported this week on a $50 app called DeepNude that automatically undresses a photo of any woman with a single click, swapping her clothes for breasts and a vulva. And even though it sounds like it would make for a lucrative scam to simply cook up the name DeepNude, sit back, and watch those $50 charges rack up, this was for real.

After Motherboard’s exposé, the site was swept offline by a tidal wave of drool.

Following the controversy that erupted, the app’s anonymous creator, who goes by the name Alberto, said he had shut it down:

Despite the safety measures adopted (watermarks) if 500,000 use it the probability that people will misuse it will be too high. […] The world is not yet ready for DeepNude.

After all, Alberto is “not a voyeur”; he’s a “technology enthusiast”:

I’m not a voyeur, I’m a technology enthusiast. Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That’s why I created DeepNude.

Nasty ramifications for revenge porn victims-to-be

Despite the shuttering of DeepNude – for now at least – similar services will likely be hot on its heels. Motherboard talked to Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, who found DeepNude “absolutely terrifying.”

Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.

Unfortunately for the portion of the public that doesn’t want to be unwillingly cast in nude photos, that horse left the barn long ago. The rinky-dink Photoshopped precursors to deepfakes were easy enough to spot, and even deepfakes could still be detected, at least by experts. But DeepNude kicks it up a notch… or two or three.

When Motherboard showed the DeepNude app and its fake nudes to Hany Farid, a computer science professor at UC Berkeley and an expert in the digital forensics of deepfakes, he said he was shocked not only at the advances in deep fakery that the app demonstrates, but also at how easy it makes such fakery for anybody:

We are going to have to get better at detecting deepfakes, and academics and researchers are going to have to think more critically about how to better safeguard their technological advances so that they do not get weaponized and used in unintended and harmful ways.

In addition, social media platforms are going to have to think more carefully about how to define and enforce rules surrounding this content. And, our legislators are going to have to think about how to thoughtfully regulate in this space.

Alberto told Motherboard that his software is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017. Pix2pix uses a family of dueling computer programs known as generative adversarial networks (GANs): machine learning systems that pit two neural networks, a generator and a discriminator, against each other until the generator’s output looks convincingly real. The same technique has been used to generate convincing photos of people who don’t exist.
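To make the “dueling programs” idea concrete, here is a minimal GAN training loop in PyTorch. This is only an illustrative sketch, not DeepNude’s or pix2pix’s actual code: the random stand-in “images”, network sizes, and training settings are all placeholder assumptions.

```python
# Minimal GAN sketch (illustrative only; NOT DeepNude's code).
# A generator learns to produce fakes; a discriminator learns to
# tell fakes from real samples. Each network's improvement forces
# the other to improve, too.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 28 * 28  # placeholder sizes

# Generator: maps random noise to a fake "image" with values in [-1, 1].
G = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, img_dim), nn.Tanh(),
)

# Discriminator: scores how "real" an image looks (raw logit).
D = nn.Sequential(
    nn.Linear(img_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1),
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.rand(32, img_dim) * 2 - 1   # stand-in for real photos
    fake = G(torch.randn(32, latent_dim))

    # Train D: label real samples 1, generated samples 0.
    d_loss = (loss_fn(D(real), torch.ones(32, 1))
              + loss_fn(D(fake.detach()), torch.zeros(32, 1)))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Train G: try to make D label the fakes as real.
    g_loss = loss_fn(D(fake), torch.ones(32, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```

In an image-to-image system like pix2pix, the generator is conditioned on an input photo rather than pure noise and the discriminator judges input/output pairs, but the adversarial tug-of-war is the same.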

Experts believe that GANs were used to create what an AP investigation recently suggested was a deepfake LinkedIn profile of a comely young woman who was suspiciously well-connected to people in power.

Forensic experts easily spotted 30-year-old “Katie Jones” as a deepfake, and that story was published just earlier this month. Now we have DeepNude, which appears to have pushed the technology that much further and packaged it into an app that anybody can use to generate a deepfake within 30 seconds (a time that will shrink as development and resources ramp up, Alberto said).

DeepNude was trained on more than 10,000 nude photos of women, its creator said.

Even clumsily Photoshopped images are still an “invasion of sexual privacy.” Those words come from Danielle Citron, professor of law at the University of Maryland Carey School of Law, who, according to Motherboard, recently testified to Congress about the deepfake threat. Here’s what she told the publication:

Yes, it isn’t your actual vagina, but … others think that they are seeing you naked. As a deepfake victim said to me – it felt like thousands saw her naked, she felt her body wasn’t her own anymore.

DeepNude runs on Windows and Linux. It makes a feeble stab at protecting the privacy of the women it’s exploiting: according to the site, both the free and the premium versions stamp images with watermarks “that cover the face,” “clearly marking that it is a fake,” though it admits that in the premium version the watermarks are “reduced”.

That tenuous saving grace would be shredded quite easily by removing the watermark or “FAKE” sticker with Photoshop.

Why not just use a backscatter X-ray device, such as the ones used in airports, which see through clothing all too well? At least back when the technology first came out, the images it produced were dubbed a “virtual striptease.”

Nice, but not portable, and not affordable.

Why not just use X-ray Specs?

True, this American novelty, which doesn’t employ any actual X-rays, doesn’t actually see through clothing. Long advertised with the slogan “See the bones in your hand, see through clothes!”, the specs instead use slightly offset images to create an optical illusion that makes wearers think they’re seeing past exteriors.

Plus, you can get a pair for under $10.

That’s quite a bargain when it comes to paying for illusions!

In fact, Alberto told Motherboard, X-ray Specs were his inspiration; he saw ads for the novelty glasses while browsing magazines from the ’60s and ’70s.

Like everyone, I was fascinated by the idea that they could really exist, and this memory remained. About two years ago I discovered the potential of AI and started studying the basics. When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one. Eureka. I realized that X-ray glasses are possible!

Eureka. Hallelujah… for one cash-hungry programmer and his eager clientele, that is. For the rest of us? Not so much.

