For the past year and a half, the UK government has been promising a solution to the problem of underage people accidentally or even intentionally watching porn on the internet: a nationwide age verification system that would require would-be porn consumers to demonstrate their adulthood before accessing adult content.
The proposal, formally part of the Digital Economy Act 2017, is not the UK's first attempt at a nationwide solution to the problem of underage porn access. A few years ago, the country tried blocking adult sites at the ISP level, requiring users to opt in to access adult content. Unfortunately, that block was not as seamless as originally intended; among other issues, it ended up blocking far more than just porn.
figuring out what is and isn’t pornography is rarely a simple matter
The Digital Economy Act was intended to be a more effective way of blocking adult sites, one that puts the onus on pornographers, who must comply with the policy or risk getting banned by major payment processors. Yet as simple and appealing as that proposal may once have sounded, it has proven significantly thornier than the British Board of Film Classification (BBFC), the agency responsible for enforcing the policy, originally anticipated.
Earlier this month, the age verification system’s rollout — which was initially scheduled for this past spring — was postponed yet again, as lawmakers continued to grapple with the complexities of creating a system that adequately protects children from porn without putting excessive restrictions on other sex-related content.
To anyone in the adult industry or other sex-related fields, the UK government’s struggle to get its age verification system up and running doesn’t come as a surprise. Since the dawn of the internet, tech companies have promised nervous parents an easy way to protect their children’s innocence: tools that block access to anything too risqué while letting kids freely browse the safer side of the internet. More often than not, however, these porn blockers aren’t nearly as easy or as effective as advertised, largely because figuring out what is and is not pornography is rarely a simple matter.
Apple’s recently launched filters offer an instructive example. The parental controls feature, which debuted earlier this year, produced results that were inconsistent at best. While testing the feature, staffers at sex education site O.School discovered that plenty of innocent content was swept up along with the explicit: a search for a dulce de leche recipe was blocked, as were numerous sex education sites, Teen Vogue’s entire site, and sites offering assistance to queer youth.
“For a filter to be effective, it’s generally better, business-wise, to filter more content than less.”
How does something like that end up happening? “The issue with filters starts with the fact that their purpose is to block content,” says Michael Stabile, spokesperson for adult industry advocacy group the Free Speech Coalition. Stabile notes that parents who employ porn filters are far more likely to complain when something obscene isn’t filtered than when something innocent is, making it vastly easier for filtering companies to block as much as possible and then sort out the details later. As a result, “for a filter to be effective, it’s generally better, business-wise, to filter more content than less,” Stabile says.
In its attempts to keep all porn away from children’s innocent eyes, filtering software often ends up shutting off access to other, more PG-rated content as well, either because, in the eyes of an algorithm, there’s no good way to differentiate between sexual entertainment and sexual education, or because a popular porn search term also happens to be an innocent word used by non-pornographic sites in other contexts. (Stabile suspects the latter effect is what happened with Teen Vogue and dulce de leche. In addition to meaning “milk,” leche is sometimes used as Spanish slang for “semen.”)
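To make that failure mode concrete, here is a minimal, purely hypothetical sketch of the kind of blunt keyword matching Stabile is describing. The blocklist, function, and sample queries below are invented for illustration and don’t reflect Apple’s filter or any real product.

    # A toy illustration of keyword-collision overblocking, not any vendor's real filter.
    BLOCKED_TERMS = {"teen", "leche", "xxx"}  # hypothetical blocklist entries

    def is_blocked(text: str) -> bool:
        """Flag text if any blocklisted term appears anywhere inside it."""
        lowered = text.lower()
        return any(term in lowered for term in BLOCKED_TERMS)

    for query in ["dulce de leche recipe", "Teen Vogue", "sex ed for queer youth"]:
        print(query, "->", "blocked" if is_blocked(query) else "allowed")

    # dulce de leche recipe -> blocked   ("leche" collides with a porn search term)
    # Teen Vogue -> blocked              ("teen" collides with a porn search term)
    # sex ed for queer youth -> allowed  (nothing matches, even though real filters
    #                                     have blocked sites much like this)

Matching on bare substrings is cheap and errs on the side of blocking, which is exactly the business incentive Stabile describes; the cost is that recipes and magazines get caught while plenty of actual porn slips through.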
Since the Digital Economy Act passed last year, the British government has been working with the adult industry to piece together a system that avoids these problems and meets everyone’s needs. But as the repeated delays have shown, that task is far easier said than done. And some sex educators worry that even if the UK does manage to build a system that works exactly as advertised, blocking all adult content and only adult content, it may end up doing more harm than good.
“The more we block them from [good information about sex], the more we create a sense of forbidden fruit.”
Deb Hauser, president of sex ed nonprofit Advocates for Youth, is sympathetic to the parental urge to block all access to porn, but she considers it to be a bit misguided. “Inevitably in life, someone is going to show [young people] porn,” no matter how hard parents or governments work to make it inaccessible, she tells The Verge. Apple’s filters, for example, don’t block all porn sites; Reddit’s porn subreddit was still accessible in the earliest iteration of the software. (In the UK, the government admits that it may eventually require social media sites to register with its age verification platform, as well, a proposal that would be complicated to implement, to say the least.)
“Sexual development is normal and healthy, and youth sexual development needs to be supported with good information,” Hauser says, explaining that “the more we block them from that, the more we create a sense of forbidden fruit.” Rather than protecting young people from bad information, censorship frequently drives them to seek out whatever information they can find, increasing the chances that when they come across inaccurate or even harmful media, they’ll take it as fact.
For Hauser, the question is less “how can we prevent children from ever seeing porn?” and more “how can we prepare children to understand that the porn they may one day stumble across — either accidentally or on purpose — is a fantasy, and not an accurate depiction of the sum total of sex?”
Hauser points to a video produced for AMAZE, a youth sex education site Advocates for Youth created in partnership with Answer and Youth Tech Health. Rather than shaming young people for their natural sexual curiosity or presenting porn as some dastardly evil, the video acknowledges that, like many kinds of movies and media, porn is an exaggerated, often unrealistic depiction — a fantasy intended to be enjoyed by adults rather than an instruction manual for young people.
“How can we prepare children to understand that porn is a fantasy, and not an accurate depiction of the sum total of sex?”
“A good sexuality education program will talk media literacy,” Hauser explains, noting that preparing young people to be “better critical thinkers” with regard to both porn and other media messages is an essential part of developing a healthy sexual mindset.
Hauser also encourages parents to create an environment where their kids know that it’s normal — and natural — to be curious and inquisitive about sex, and to make sure they’re aware that there’s always an informed source that they can turn to when they have a question about sex (whether that’s a parent, another trusted adult, or a quality sex education resource).
“If you’re having good discussions at home, and if there are resources available to you… there isn’t as much of a compulsion [to seek out potentially bad information],” she says. When kids grow up informed and educated about sex, there’s no need to turn to porn to learn more about it. And when they do run into porn, they’re better prepared to recognize it as a fantasy rather than a reality.
Of course, having these conversations requires a lot of work that’s often uncomfortable, awkward, and embarrassing. It’s a great deal easier to pretend we can just shield young people from sexual content and never have to confront our own discomfort. That may explain why, even after months of delay, the UK is still determined to get its age verification system off the ground.