Nvidia thinks it can use autonomous driving sensors to make manual driving safer, and that’s very clever. It’s called AI Co-Pilot and Nvidia unveiled it at CES as part of an extensive set of upgrades it made to its autonomous vehicle products.
Self-driving cars will, eventually, save many lives. But it could be decades before autonomous cars can drive in every situation. Until then, there will be times when we must drive ourselves, because autonomous cars still aren’t good enough — anything from bad weather to construction zones can flummox them.
But Nvidia believes we can use all the sensors and computers to make human driving safer. AI Co-Pilot uses external cameras and radar sensors to tell you if a bicyclist is riding next to you as you’re about to make a turn, or if a pedestrian has stepped into the road. It has cameras inside the car that can watch your face and see where you’re looking, what your mood is (hey, you’re about to road rage!), and even read your lips, which is useful for when you’re jamming out to Smash Mouth and need to tell the car to get you directions to the nearest Five Guys. Or whatever.
New cars are starting to come with a litany of sensors, everything from radar and ultrasonics to front- and rear-facing cameras. Using them to keep us safer in every situation is terrific. Roughly 30,000 people die in car accidents in the US every year, and self-driving cars have the potential to prevent nearly all of those deaths. Though we’re probably still a decade or two away from self-driving cars that can handle every situation, Nvidia’s approach could make driving safer sooner rather than later.