
YouTube removes video that tests Tesla’s Full Self-Driving beta against real kids


The video violates YouTube’s guidelines against content that endangers minors


Photo by Sean O’Kane / The Verge

YouTube has removed a video that shows Tesla drivers carrying out their own safety tests to determine whether the electric vehicle’s Full Self-Driving (FSD) capabilities would make it stop automatically for children walking across or standing in the road, as first reported by CNBC.

The video, titled “Does Tesla Full-Self Driving Beta really run over kids?” was originally posted on Whole Mars Catalog’s YouTube channel and shows Tesla owner and investor Tad Park testing Tesla’s FSD feature with his own kids. During the video, Park drives a Tesla Model 3 toward one of his children standing in the road, and then tries again with his other kid crossing the street. The vehicle stops before reaching the children both times.

As outlined on its support page, YouTube has specific rules against content that “endangers the emotional and physical well-being of minors,” including “dangerous stunts, dares, or pranks.” YouTube spokesperson Ivy Choi told The Verge that the video violated its policies against harmful and dangerous content, and that the platform “doesn’t allow content showing a minor participating in dangerous activities or encouraging minors to do dangerous activities.” Choi said YouTube decided to remove the video as a result.

“I’ve tried FSD beta before, and I’d trust my kids’ life with them,” Park says during the now-removed video. “So I’m very confident that it’s going to detect my kids, and I’m also in control of the wheel so I can brake at any time.” Park told CNBC that the car was never traveling more than eight miles an hour and that he “made sure the car recognized the kid.”

As of August 18th, the video had over 60,000 views on YouTube. The video was also posted to Twitter, where it remains available to watch. The Verge reached out to Twitter to see if it has any plans to take it down but didn’t immediately hear back.

The crazy idea to test FSD with real — living and breathing — children emerged after a video and ad campaign posted to Twitter showed Tesla vehicles seemingly failing to detect and colliding with child-sized dummies placed in front of the vehicle. Tesla fans weren’t buying it, sparking a debate about the limitations of the feature on Twitter. Whole Mars Catalog, an EV-driven Twitter and YouTube channel run by Tesla investor Omar Qazi, later hinted at creating a video involving real children in an attempt to prove the original results wrong.

In response to the video, the National Highway Traffic Safety Administration (NHTSA) issued a statement warning against using children to test automated driving technology. “No one should risk their life, or the life of anyone else, to test the performance of vehicle technology,” the agency told Bloomberg. “Consumers should never attempt to create their own test scenarios or use real people, and especially children, to test the performance of vehicle technology.”

Tesla’s FSD software doesn’t make a vehicle fully autonomous. It’s available to Tesla drivers for an additional $12,000 (or a $199 per month subscription). Once Tesla determines that a driver meets a certain safety score, it unlocks access to the FSD beta, enabling drivers to input a destination and have the vehicle drive there using Autopilot, the vehicle’s advanced driver assistance system (ADAS). Drivers must still keep their hands on the wheel and be ready to take control at any time.

Earlier this month, the California DMV accused Tesla of making false claims about Autopilot and FSD. The agency alleges the names of both features, as well as Tesla’s description of them, wrongly imply that they enable vehicles to operate autonomously.

In June, the NHTSA released data about driver-assist crashes for the first time, finding that Tesla vehicles using Autopilot were involved in 273 crashes from July 20th, 2021 to May 21st, 2022. The NHTSA is currently investigating a number of incidents where Tesla vehicles using driver-assist technology collided with parked emergency vehicles, in addition to over two dozen Tesla crashes, some of which have been fatal.

Update August 20th, 2:10PM ET: Updated to add a statement and additional context from a YouTube spokesperson.