Prisma Labs, maker of Lensa AI, says it is working to prevent accidental generation of nudes
We recently uncovered that Lensa AI can be tricked into creating NSFW images. When TechCrunch shared its findings with the Prisma team, the company’s CEO, Andrey Usoltsev, responded.
Usoltsev also offered some context for why Lensa ends up generating NSFW images, explaining that this is a result of the underlying technology, Stability AI’s Stable Diffusion model, doing what it is told, and only in a sandboxed environment of sorts.
“Stable Diffusion neural network is running behind the avatar generation process,” says Usoltsev. “Stability AI, the creators of the model, trained it on a sizable set of unfiltered data from across the internet. Neither us, nor Stability AI could consciously apply any representation biases. To be more precise, the man-made unfiltered data sourced online introduced the model to the existing biases of humankind. The creators acknowledge the possibility of societal biases. So do we.”
Since the end of November 2022, Stability AI has adapted the Stable Diffusion model to make it harder for users to generate nude and pornographic imagery, but Prisma Labs’ CEO acknowledges that savvy users can outmaneuver these adaptations.
“We are in the process of building the NSFW filter. It will effectively blur any images detected as such.” Andrey Usoltsev, CEO of Prisma Labs
“We specify that the product is not intended for minors and warn users about the potential content. We also abstain from using such images in our promotional materials,” Usoltsev told us. “To enhance the work of Lensa, we are in the process of building the NSFW filter. It will effectively blur any images detected as such. It will remain at the user’s sole discretion if they wish to open or save such imagery.”
The Prisma Labs team points out two mitigating factors here: by uploading explicit images, users train a particular, individual copy of the model, which the company says is deleted once generation is complete; and those images cannot be used to train the model further. In other words: if you upload porn to make more porn, that’s kind of on you.
“There’s no doubt that a wider conversation around AI use and regulations needs to take place in the near future and we’re keen to be a part of it. We also provide all necessary guidance and appropriate warnings to enable the best experience of the Magic Avatars feature,” says Usoltsev. “But if an individual is determined to engage in harmful behavior, any tool would have the potential to become a weapon.”
The company didn’t share whether it has plans to prevent the creation of so-called ‘deepfake’ nude imagery.
In the meantime, I suppose I get to enjoy consensually generated photos of myself looking better than I ever have in any photo, and I encourage others to obtain consent before creating porn of anyone else.