    Google grant will help computers detect gender balance and stereotyping in movies and TV

    Google has funded plenty of research and humanitarian projects, and now it's putting some of its money towards figuring out how to automate sociological research. As part of its $23 million in new Global Impact Awards, Google announced $1.2 million in funding for the Geena Davis Institute on Gender in Media. The eight-year-old Institute takes a research-based approach towards increasing representation and decreasing stereotyping of women and girls in film and TV, building a massive database of how many female characters appear in scenes, how often they speak, and how often they are sexualized compared to men. As one might suspect, the results aren't great for women and girls — the Institute concludes that in family films, for example, there are about three male characters for every female one, and women are less likely overall to be shown working or even given a speaking role.

    "Only by having the facts can we put a spotlight on how females are portrayed."

    Right now, though, surveys are limited by the need to manually count scenes and characters, something the Institute hopes to change. Google's grant will go towards creating an automated tool that can analyze a movie's demographics much the way existing programs can match an image or recognize what someone is doing in a video. Not only would this free up time and money for other work; Google and the Institute say it could let the research scale dramatically, allowing researchers to analyze a fuller set of data (the current sample is limited to the highest-grossing movies) or study media landscapes outside the US.
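
    To make the idea concrete, the simplest version of such a tool might sample frames from a film, detect faces, and tally which gender appears on screen. The rough Python sketch below does that with OpenCV's stock face detector; the classify_gender function is a hypothetical stand-in for a trained classifier, and none of this reflects how Google's or the Institute's actual system will work.

    import cv2

    def tally_screen_time(video_path, classify_gender, sample_every=30):
        """Sample frames, detect faces, and tally on-screen appearances by gender.

        classify_gender(face_image) -> "female" or "male" is a hypothetical
        stand-in for any trained classifier; face detection uses the Haar
        cascade bundled with OpenCV.
        """
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        counts = {"female": 0, "male": 0}
        capture = cv2.VideoCapture(video_path)
        frame_idx = 0
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            if frame_idx % sample_every == 0:
                # Detect faces in the sampled frame and classify each crop.
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
                    face = frame[y:y + h, x:x + w]
                    counts[classify_gender(face)] += 1
            frame_idx += 1
        capture.release()
        return counts

    Speaking time would need a similar pass over the audio track, and accuracy would hinge on the classifier, but the basic loop (sample, detect, classify, count) is the kind of work the grant aims to automate.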

    Geena Davis Institute executive director Madeline Di Nonno told Wired that the relatively small grant is still "extremely substantial" for the program, and that "only by having the facts can we put a spotlight on how females are portrayed." The hope is that by highlighting that portrayal, the Institute can motivate people to change it, both helping women in media and giving children positive female role models (or at least female characters who aren't lone tokens). An automated system could act as a kind of mass Bechdel test, pointing out overall gender bias. As the gaming community has recently discussed, there's no single reason for the gender imbalance found in many fields, and Google itself is working, with mixed results, to expand its own female workforce. But trying to tip the balance away from portraying women mostly as love interests, mothers, or nonentities certainly can't hurt.