Lensa has been climbing the app store charts with its avatar-generating AI, which already has artists raising red flags. Now there's another reason to raise one: it turns out it's possible, and far too easy, to use the platform to generate non-consensual soft porn.
TechCrunch has seen photo sets generated with the Lensa app that include images with breasts and nipples clearly visible, attached to the faces of recognizable people. It seemed like the kind of thing that shouldn't be possible, so we decided to try it ourselves. To verify that Lensa will create the images it perhaps shouldn't, we generated two sets of Lensa avatars:
One set, based on 15 photos of a well-known actor.
Another set, based on the same 15 photos, plus an additional 5 photos of the same actor's face photoshopped onto topless models.
The first set of images was in line with the AI avatars we've seen Lensa generate in the past. The second set, however, was a lot spicier than we were expecting. It turns out the AI takes those photoshopped images as permission to go wild, and it appears to disable its NSFW filter. Out of the 100-image set, 11 were topless photos of higher quality (or, at least, higher stylistic consistency) than the poorly edited topless photos the AI was given as input.
Generating saucy images of celebrities is one thing, and as illustrated by the source images we were able to find, there have long been people on the internet willing to collage images together in Photoshop. Just because it's common doesn't make it right: celebrities absolutely deserve their privacy and should definitely not be made victims of non-consensual sexualized depictions. But until now, making those images look realistic has taken considerable skill with photo-editing tools, along with hours, if not days, of work.
The big turning point, and the ethical nightmare, is how easily you can create hundreds of near-photorealistic AI-generated images with nothing more than a smartphone, an app, and a few dollars.
The ease with which you can create images of anyone you can imagine (or, at least, anyone you have a handful of photos of) is terrifying. Add NSFW content to the mix, and we careen into some pretty murky territory very quickly: your friends, or some random person you met in a bar and friended on Facebook, may not have consented to someone generating softcore porn of them.
It appears that if you have 10 to 15 'real' photos of a person and are willing to take the time to photoshop a handful of fakes, Lensa will gladly churn out a number of problematic images.
AI art generators are already churning out pornography by the thousands of images, exemplified by the likes of Unstable Diffusion and others. These platforms, and the unfettered proliferation of other so-called 'deepfake' tools, are turning into an ethical nightmare and have prompted the UK government to push for laws criminalizing the dissemination of non-consensual nude photos. That seems like a very good idea, but the internet is a hard-to-govern place at the best of times, and we are collectively facing a wall of legal, moral, and ethical quandaries.
We reached out to Prisma Labs, the maker of Lensa AI, for comment, and will update this story when we hear back.
It's way too easy to trick Lensa AI into making NSFW images, by Haje Jan Kamps, originally published on TechCrunch.