Google came under fire this week after its new Photos app categorized photos in one of the most racist ways possible. On June 28th, computer programmer Jacky Alciné found that the feature kept tagging pictures of him and a friend as "gorillas." He tweeted at Google asking what kind of sample images the company had used that would allow such a terrible mistake to happen.
Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4— diri noir avec banan (@jackyalcine) June 29, 2015
Google’s chief social architect Yonatan Zunger responded quickly, apologizing for the error.
@jackyalcine Holy fuck. G+ CA here. No, this is not how you determine someone's target market. This is 100% Not OK.— Yonatan Zunger (@yonatanzunger) June 29, 2015
As part of the new Photos app, users’ photos can be tagged and arranged automatically based on the objects they contain. For instance, pictures of skyscrapers are all gathered into an album appropriately labeled "skyscrapers." The tagging feature learns as it receives more data, refining its method for recognizing and categorizing objects. The feature is clearly flawed, but automatic categorization isn’t the only part of photography where companies have stumbled over race.
For years, Kodak used a coating on its film that favored Caucasian skin tones, making it more difficult to photograph darker skin. Nikon and other consumer camera companies also have a history of bias toward white faces in their facial recognition software. Zunger says that Google has had similar issues with facial recognition due to inadequate analysis of skin tones and lighting. "We used to have a problem with people (of all races) being tagged as dogs," Zunger says.
Google attempted to fix the algorithm, but ultimately removed the gorilla label altogether. Zunger noted that the company is working on longer-term fixes, including identifying which labels could be problematic and improving recognition of dark-skinned faces.