
Drive.ai wants to help autonomous cars talk with the people around them


With beeps and emoji!


If you are walking around Mountain View, California, and stumble across a car that beeps at you like R2-D2, it’s probably thanks to Carol Reiley. She’s the co-founder and president of Drive.ai, an autonomous car startup that wants to help robot cars learn how to interact with people better — and maybe even deliver a world without honking.

"My interest in robotics lies in where technology intersects with humanity," says Reiley. "How do these cars actually interact with people?"

Autonomous car development has mostly focused on the technology itself: handling challenges like bad weather or bicyclists, while companies with grander ambitions aim to change the entire model of car ownership. Drive.ai wants to solve the technological problems too. The company is using "deep learning," a method that lets artificial intelligence learn behaviors on its own instead of needing to be told how to do everything, to help vehicles learn how to drive. But the most interesting part of the company is its work on human-robot interaction. Basically, Reiley and her team want to help self-driving cars talk to the people (pedestrians, cyclists, other drivers) around them — something she believes is essential to public acceptance of the technology.

Pedestrians need to know what self-driving cars are about to do

There’s a huge amount of unspoken interaction between drivers and the world around them — waving someone into a merge lane, for instance, or gesturing to a pedestrian to cross the street. It’s important for self-driving cars to engage in those interactions as well, Reiley says. And since a self-driving car won’t have a face or hands, it needs another way to interact.

Drive.ai is working on LED signs for the vehicle that use text and emoji-like pictures to communicate. Consider lane-splitting, the practice of a motorcyclist riding between two lanes of slow or stalled traffic, which is legal in California. Lane-splitting can be dangerous for the motorcyclist: if a driver doesn’t see the motorcycle and changes lanes, the car can block the motorcycle’s path and cause an accident. That’s why, Reiley says, "it’s really important to make it clear that the car sees you, and then you can act appropriately."

Auditory feedback is important too. A car horn is used for everything from warning another driver that an accident is imminent to expressing support for a fundraiser. But horns don’t change direction or volume. "The horn is one of the worst designed features on the car," Reiley says. So Drive.ai is working on an advanced version of auditory feedback, allowing the car to "see the context" of the situation and emit a "more socially appropriate" honk. (Google is working on something similar.)

Self-driving cars should use a "more socially appropriate" honk

It’s curious to think that driving a car and driving a car in traffic are two different things, but it makes sense: it’s one thing to know the rules of the road and how to follow them, and quite another to expect everyone else to do the same. A couple of years ago, I had my first experience with adaptive cruise control, in a Volvo on a visit to Massachusetts. The technology was fantastic for stop-and-go Boston traffic, but there was a big problem: the car left entirely too much space between me and the car ahead. The vehicle was just trying to keep me safe, leaving room in case of sudden braking, but Boston’s drivers didn’t see it that way — they just saw a gap to exploit. In practice, the safety feature just meant I got cut off more often.

That isn’t a problem that Drive.ai is looking to solve specifically, but it is an example of how autonomous cars will need to adjust their social behavior to fit in with human drivers.

Drive.ai’s first product will be a retrofit kit that it plans to sell to fleet companies doing freight or people delivery, ride sharing, and the like. The cars will roll through the Drive.ai factory, have the kit installed, and then head out onto the road. The company is working with potential partners to define commonly used routes before gradually expanding to larger and larger coverage areas.

"Our north star is to work with automakers" and big fleet companies, Reiley says. "We’d love to get this in consumer cars, but that timeline is too long. It’s very difficult to do a global deployment at scale. Our first steps are with business partnerships and we’ll be starting pilot studies in the next few months."

The company was granted an autonomous vehicle license to test cars on California roads and received $12 million in funding last year. Its human-robot interface vehicles should begin hitting the roads of Mountain View soon.

"We’re just at the very start of a new industry," says Reiley. "The relationship of people and their cars are going to change." And, if Drive.ai is right, maybe the interactions between people and cars on the road will change too.

Correction 10:15am ET: A previous version of this article stated that Carol Reiley left Stanford University's AI lab to found Drive.ai. That was false; Reiley attended Johns Hopkins for grad school while the rest of her team came over from Stanford's AI lab to found Drive.ai. We regret the error, and the article has been updated.