The creators of DeepNude, a desktop app that used artificial intelligence to transform a photo of a clothed woman into a picture of her naked, have shut down the app and renounced using the software, a day after an article focused attention on the program.
“We don’t want to make money this way,” said a message posted on the app’s Twitter account, which still carries a bio describing the program as the “superpower you always wanted.”
“Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it,” the post continued. “The world is not yet ready for DeepNude.”
In addition, the program’s website returned a blank page with the text “not found.”
The app is the latest form of media manipulation to raise questions about privacy and consent as artificial intelligence gets better at creating fake photos and videos. Though computer manipulation of media has existed for decades, programs like DeepNude are making the creation of sophisticated fakes easier for average people to do — and making forgeries harder to identify with the unaided eye.
Saying that the app was originally created as entertainment, DeepNude’s post discouraged use of the program and said that downloading the software from other sources or sharing it would violate its terms. The post also said DeepNude won’t be released in other versions, and that nobody — including people who hold a license for a premium version — has permission to use it. (It’s unclear how or if DeepNude can enforce those terms; the creators weren’t immediately reachable for comment.)
DeepNude, which launched as a downloadable Windows and Linux application on June 23, was the subject of a Vice article Wednesday.