This week, NBC reported that facial recognition researchers at companies like IBM often feed their algorithms photos from publicly available collections — images covered only by a Creative Commons license — without requesting permission from the people who are photographed. The incident raised the question of whether such training counts as a valid use under Creative Commons licenses.
It looks like the answer is yes — but Creative Commons also argues that today’s copyright laws may be insufficient to protect your face from being scanned, regardless of whether the photographer is using a permissive Creative Commons license or simply reserving all their rights to the photo. And that’s before we consider that the photographer, not the person photographed, is the one with copyright over a photo.
“Copyright is not a good tool to protect individual privacy,” Creative Commons CEO Ryan Merkley writes in a blog post. Here’s a longer quote that explains CC’s current perspective on the matter:
CC licenses were designed to address a specific constraint, which they do very well: unlocking restrictive copyright. But copyright is not a good tool to protect individual privacy, to address research ethics in AI development, or to regulate the use of surveillance tools employed online. Those issues rightly belong in the public policy space, and good solutions will consider both the law and the community norms of CC licenses and content shared online in general.
According to the NBC report from earlier this week, IBM took nearly a million photos from Flickr to train facial recognition programs. Many Flickr photographers told NBC that they and the people photographed had not been made aware that their images were being fed to facial recognition algorithms. “None of the people I photographed had any idea their images were being used in this way,” one person told NBC. “It seems a little sketchy that IBM can use these pictures without saying anything to anybody.”
In a statement, IBM told The Verge that it takes “the privacy of individuals very seriously and have taken great care to comply with privacy principles, including limiting the Diversity in Faces dataset to publicly available image annotations and limiting the access of the dataset to verified researchers.”
Creative Commons says that while it doesn’t have all the facts about the IBM data set, it’s “aware that fair use allows all types of content to be used freely,” meaning the fair use exceptions to copyright law may allow researchers to use these photos regardless of their licenses. The organization has added an entire new section to its frequently asked questions page that answers a number of our burning questions about how AI use may or may not violate the terms of a CC license.
But one of those answers does suggest that Creative Commons intends to stay agnostic when it comes to new uses of photos, for better or for worse: “This is one of the enduring qualities of our licenses — they have been carefully designed to work with all new technologies where copyright comes into play. No special or explicit permission regarding new technologies from a copyright perspective is required.”
That essentially means that even if someone invents a brand-new technology tomorrow, many Creative Commons photos would still be fair game (if they weren’t already fair game because of fair use). And people who feel their photos have been misused don’t have much legal recourse unless public outcry turns into regulation of facial recognition — a new bill was introduced in the US Senate just today. In the case of IBM, photographers can request that their photos be removed, provided they can verify that the photos were used in the first place. Helpfully, NBC has a tool within its story where photographers can check.
Correction, 6:08 PM ET: Added better explanation of CC’s stance, and removed conflation between fair use (as a copyright exemption) and permitted uses under a Creative Commons license. We regret the confusion. Also added that a new bill was introduced in the US Senate to potentially stop companies from this practice.