
Audi pulls the curtain back on its self-driving car program

Autonomous Intelligent Driving is to Audi as Cruise is to GM, says the German automaker’s CTO

Audi h-tron quattro concept at the Detroit Auto Show. Photo: Sean O’Kane / The Verge

Of all the luxury car brands, Audi has been the most aggressive about putting semi-autonomous technology into its production cars. (See the A8 sedan, a version of which can’t be sold in the US because of its partially automated features.) Now the German automaker is offering a sneak peek at its effort to build fully driverless cars, as well as at one of the partners it says will be instrumental in putting self-driving cars on the road by 2021.

Audi, which is owned by the Volkswagen Group, recently pledged to spend almost $16 billion (14 billion euros) on electric mobility and self-driving technology through 2023. Much of that work will take place at Autonomous Intelligent Driving (AID), a wholly owned subsidiary of Audi.

The group was founded a year and a half ago and today has around 150 employees. It is headquartered in Munich, where it also has 12 autonomous test vehicles operating on public roads. Most of the test vehicles are VW Golf hatchbacks, which speaks to AID’s key role as the urban autonomous driving technology supplier for all Volkswagen Group brands, including VW, Audi, and Porsche.

Alexandre Haag, the group’s chief technology officer, said AID is to Audi as Cruise is to General Motors, or Argo is to Ford. But rather than acquire an outside company to kickstart its AV program, the German automaker sought to build its self-driving team from scratch.

“Our goal is to develop the full Level 4 stack,” Haag told The Verge. (The Society of Automotive Engineers defines Level 4 as a car that completely drives itself from start to finish within a specifically designated area.) “The first application to be robo-taxi,” Haag added, “and in the long term provide the whole group with a self-driving stack for ownership vehicles, trucks, buses, food deliveries... everything in the long term.”

“Our goal is to develop the full Level 4 stack”

Like most AV programs, the Audi subsidiary uses deep learning-based software to process the sensing data picked up by its suite of sensors: LIDAR point clouds, camera pixels, and radar echoes. That perception data is used to model the vehicle’s environment, both near and far, by detecting vehicles, pedestrians, and other obstacles.
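To make the shape of that pipeline concrete, here is a minimal, purely illustrative Python sketch of per-sensor detections being merged into a single list of obstacles. The class, function names, and confidence threshold are hypothetical and are not taken from AID’s software.

```python
# Illustrative only: per-sensor detections feeding one shared environment model.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str           # e.g. "pedestrian", "vehicle"
    position_m: tuple    # (x, y) in the car's frame, meters
    confidence: float    # detector score between 0 and 1

def fuse_detections(lidar_dets, camera_dets, radar_dets, min_conf=0.5):
    """Merge per-sensor detections into one obstacle list (naive union)."""
    fused = []
    for det in (*lidar_dets, *camera_dets, *radar_dets):
        if det.confidence >= min_conf:
            fused.append(det)
    return fused

obstacles = fuse_detections(
    lidar_dets=[Detection("vehicle", (12.0, 1.5), 0.92)],
    camera_dets=[Detection("pedestrian", (8.0, -2.0), 0.81)],
    radar_dets=[Detection("vehicle", (40.0, 0.3), 0.67)],
)
print(obstacles)
```

A real stack would also track objects over time and reconcile conflicting detections; this sketch only shows the basic fan-in from sensors to a single obstacle list.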

That said, Audi is not as far ahead as other automakers in developing and deploying automated vehicles. Waymo, the self-driving division of Alphabet, just launched a very limited commercial robot taxi service outside of Phoenix, Arizona. GM says it will unveil its own ride-hailing service in San Francisco in 2019. Audi is aiming to deploy its robot taxis at scale in 2021, the same year targeted by Ford for its service.

“Frankly, I think there’s some hype out there,” Haag said. “I think we’re still some way from real scaling... I have a high opinion of Waymo, but I think [GM’s] Cruise is hyping a little bit more. I think they have a good team and are on a good path, but I’m quite sure if they release something in 2019, it’s going to be very marginal, very small. Maybe a dedicated lane, fixed-route shuttle, a few kilometers or something like this. But being able to do all of San Francisco, or even just Market Street, I really don’t see this happening next year.”

Still, Haag acknowledged the challenges of developing fully driverless cars, something he said all car companies are currently grappling with. Perception, or the car’s ability to “see” and correctly identify nearby objects, is incredibly difficult, as is predicting what those objects will do.

“When you’re at 95 percent, you’ve just scratched the surface”

“Getting to 90 percent [in perception] is fairly easy,” Haag said. “Getting to 95 percent starts to get interesting. And then you still need to go way beyond that. Nine point nine nine nine nine... Adding each nine is ten times harder. When you’re at 95 percent, you’ve just scratched the surface.”
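To put rough numbers on that quote: a perception system that is 90 percent accurate misses one object in ten, and each additional nine cuts the miss rate by another factor of ten. The short Python snippet below is purely illustrative and uses made-up accuracy figures, not anything measured by AID.

```python
# Illustrative only: each extra "nine" of perception accuracy shrinks the
# miss rate by another factor of ten. Accuracy values are made up.
for accuracy in (0.90, 0.95, 0.99, 0.999, 0.9999, 0.99999):
    misses_per_million = (1 - accuracy) * 1_000_000
    print(f"{accuracy:.3%} accurate -> ~{misses_per_million:,.0f} misses per million objects")
```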

To boost its efforts, Audi is bringing on new suppliers to help with these challenges of perception and prediction. On Tuesday, AID announced that it will be working with Luminar, a Palo Alto-based startup that makes LIDAR sensors and perception software for autonomous vehicles. Luminar also supplies Volvo and the Toyota Research Institute, as well as several other automakers it hasn’t revealed yet. LIDAR, the laser sensor that sends out millions of laser pulses per second and measures how long they take to bounce back, is seen as a key ingredient of autonomous driving.
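As a rough illustration of that time-of-flight principle (not code from Luminar or AID), the sketch below converts a laser pulse’s round-trip time into a distance; the example timing is invented.

```python
# Time-of-flight ranging, the principle behind LIDAR: time the laser pulse's
# round trip and halve the distance light travels in that interval.
SPEED_OF_LIGHT_M_S = 299_792_458

def range_from_round_trip(round_trip_s: float) -> float:
    """Distance in meters to the surface that reflected the pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2

# A return arriving ~1.67 microseconds after the pulse left is about 250 m away.
print(round(range_from_round_trip(1.67e-6), 1))  # 250.3
```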

Haag said he was impressed with the range and point-cloud density of Luminar’s LIDAR. “If you have a more powerful sensor, it makes your perception task easier,” he said. “And that’s where Luminar comes in.” AID’s vehicles will use Luminar’s sensors for long-range perception and detection, and shorter-range LIDAR made by Velodyne for closer objects.
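A toy sketch of that kind of range split might look like the following; the 50-meter cutoff is an assumption for illustration, not a figure from AID, Luminar, or Velodyne.

```python
# Toy illustration of splitting perception duties by range between a
# long-range and a short-range LIDAR. The 50 m cutoff is an assumption.
def lidar_for_target(distance_m: float, cutoff_m: float = 50.0) -> str:
    return "long_range_lidar" if distance_m >= cutoff_m else "short_range_lidar"

print(lidar_for_target(150.0))  # long_range_lidar
print(lidar_for_target(12.0))   # short_range_lidar
```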

“It’s a big win,” said Austin Russell, the CEO of Luminar. His company has been working with Audi for over a year, but is only now ready to talk publicly about the partnership. Russell said that, as a relative newcomer to self-driving, AID is more nimble than other AV operators that have been working on the technology for a decade or more.

“These guys don’t necessarily have a lot of the baggage to be encumbered by from software stacks that have been developed from previous years on different sensing types,” he said. Other operators can get bogged down by “legacy code that prevents them from being agile and move easily to new platforms, like ours,” Russell added.