Google responded to congressional child online safety proposals with its own counteroffer for the first time Monday, urging lawmakers to drop measures it considers problematic, like age-verification requirements.
In a blog post, Google released its “Legislative Framework to Protect Children and Teens Online.” The framework comes as more lawmakers, like Sen. Elizabeth Warren (D-MA), are lining up behind the Kids Online Safety Act, a controversial bill intended to protect kids from dangerous content online.
In the framework, Google rejects state and federal attempts at requiring platforms to verify the age of users, like forcing users to upload copies of their government IDs to access an online service. Some states have recently gone as far as passing laws requiring platforms to obtain parental consent before anyone under 18 is allowed to use their services. Google dismisses these consent laws, arguing that they bar vulnerable teens from accessing helpful information.
“Good legislative models — like those based on age-appropriate design principles — can help hold companies responsible for promoting safety and privacy, while enabling access to richer experiences for children and teens,” Kent Walker, Google’s president of global affairs, said in the blog post.
Lawmakers like KOSA author Sen. Richard Blumenthal (D-CT) and Sen. Ed Markey (D-MA) have called on tech companies to stop targeting ads to kids. In its framework, Google says platforms should ban the practice “for those under 18.”
Over the last year, state legislatures across the country have passed new laws regulating how kids under 18 can interact with the internet. Some states, like Louisiana, have tailored their bills to ban kids from seeing online porn by forcing everyone, including adults, to verify their age before using such sites. Google’s proposal does not oppose age verification on porn and gambling sites.
YouTube published its own set of principles for protecting kids on Monday, laying out how the platform implements some of the guidance from Google’s policy framework. In a blog post, YouTube CEO Neal Mohan said the platform doesn’t serve personalized ads to kids and provides parents with a set of family controls.
“Families everywhere deserve the same safe, high-quality experience online, no matter where they live,” Mohan wrote. “And all children and teenagers ought to have the same access to the opportunities the internet provides. As young people continuously evolve the way they show up online, so will our services and policies.”