OpenAI is letting users of its AI art generator DALL-E edit images containing human faces. The feature was previously off-limits due to fears of misuse, but in a letter sent to DALL-E’s million-plus users, OpenAI says it’s opening up access after improving its filters to remove images that contain “sexual, political, and violent content.”
The feature will let users edit images in a number of different ways. They can upload a photograph of someone and generate variations of the picture, for example, or they can edit specific features, like changing someone’s clothing or hairstyle. The feature will no doubt be useful to many users in creative industries, from photographers to filmmakers.
“With improvements in our safety system, DALL·E is now ready to support these delightful and important use cases — while minimizing the potential of harm from deepfakes,” said OpenAI in its letter to customers announcing the news.
The decision is part of an ongoing negotiation between the makers of AI art generators and their users as they try to navigate the technology’s potential harms. As a well-funded company with links to tech giants like Microsoft, OpenAI has taken a relatively cautious approach. But the company has been outflanked by rivals like Stability AI, whose open-source Stable Diffusion model places fewer restraints on users. This leads to quicker development of the technology, but also makes malicious applications far easier. Stable Diffusion, for example, is already being used to generate pornographic deepfakes of celebrities.