For the past decade, the easiest way to spot a self-driving car was to look for the distinctive spinning bucket mounted to its roof. The classic lidar design pioneered by Velodyne spins 64 lasers through 360 degrees, producing a three-dimensional view of the car’s surroundings from the reflected laser beams.
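The ranging principle behind those reflected beams is simple time-of-flight: the sensor times how long a laser pulse takes to bounce back, and halves the round trip. A minimal illustrative sketch (not from the article; the function name is ours):

```python
# Illustrative sketch: how a pulsed lidar turns a laser pulse's
# round-trip time into a distance to the reflecting surface.
C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to the target, given the pulse's round-trip time."""
    # The pulse travels out and back, so the one-way distance is half.
    return C * t_seconds / 2.0

# A return after ~200 nanoseconds means a target roughly 30 meters away.
print(round(range_from_round_trip(200e-9), 1))  # → 30.0
```

Repeating that measurement across 64 lasers and a full rotation is what yields the dense 3D point cloud.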
That complicated and bulky set-up has traditionally also been expensive. Velodyne’s US $75,000 lidar famously cost several times the sticker price of the Toyota Priuses that formed the nucleus of Google’s original self-driving car fleet.
Those days are long gone. Velodyne now sells its cheapest 16-laser lidar for $4,000, and a host of startups are nipping at its heels with solid-state lidars that could soon be as cheap as $100 each. Many of the new lidars work in fundamentally different ways from Velodyne’s bucket—a shift that brings new capabilities and new challenges.
With the notable exception of Tesla, almost every company working on self-driving vehicles uses three core sensor technologies: cameras to identify road users, signs and traffic lights; radars for accurate ranging out to long distances; and lidars for high-resolution 3D imaging. (Tesla insists that only cameras, radars, and ultrasonic sensors are necessary, although many in the industry think the company will integrate lidars in future vehicles.)
“Cameras are in a very good place and will continue to get better, automotive radar has been well established… but lidar is an area that has been underexplored,” said Chris Urmson at CES. Urmson was long the head of Google’s self-driving car program and is now CEO of Aurora Innovation, an autonomous technology startup that recently announced partnerships with Volkswagen and Hyundai. “The most exciting thing about lidar is that suddenly, there are a whole lot of people working on it,” he said.
The primary driver for carmakers is to move away from sensitive electro-mechanical systems to something more durable, said Jim Zizelman, vice president of engineering and program management for Aptiv, a global automotive parts company: “Think about a mechanical spinning system going over potholes and through temperature extremes. A solid-state system should give orders of magnitude more reliability.”
CES this year saw many lidar companies demonstrating solid-state lidars. LeddarTech was showing a lidar system-on-a-chip that it says will enable $100 solid-state lidars at production volumes, while rival Innoviz unveiled a high-resolution lidar based on MEMS—tiny microelectromechanical mirrors that steer a single fixed laser.
Innoviz boasted that its unit has a field of view greater than 70 degrees. While that is better than some other solid-state lidars, it’s clearly less impressive than Velodyne’s 360-degree bucket. Solid-state lidars, then, generally have to build up a panoramic view by fusing data from multiple units mounted on the front, sides, and rear of a car. That requires additional computation but also gives car designers a little more freedom.
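A rough way to see the fusion requirement is to check how much of the horizon a set of fixed, limited-field-of-view units actually covers. This is a hypothetical sketch (the mounting angles and function are ours, not any vendor's layout):

```python
# Hypothetical sketch: how many whole degrees of the horizon fall inside
# at least one solid-state unit's field of view, for units mounted at
# the given boresight azimuths (all angles in degrees).
def covered_degrees(mount_azimuths, fov_deg):
    covered = set()
    half = fov_deg / 2.0
    for azimuth in mount_azimuths:
        for d in range(360):
            # Shortest angular distance from direction d to this boresight.
            diff = abs((d - azimuth + 180) % 360 - 180)
            if diff <= half:
                covered.add(d)
    return len(covered)

# Five 72-degree units spaced evenly around the car span the full horizon...
print(covered_degrees([0, 72, 144, 216, 288], 72))  # → 360
# ...while four of the same units leave blind wedges between them.
print(covered_degrees([0, 90, 180, 270], 72))
```

This is why cars carrying solid-state lidar tend to mount several units, and why their point clouds must then be stitched together in software.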
Legendary car designer Henrik Fisker was at CES talking about his latest vehicle, the Fisker EMotion, a luxury all-electric sedan that uses solid-state lidar from Quanergy. “For 100 years, the front end of cars has been designed around the radiator,” he told IEEE Spectrum. “Now we don’t need radiators. I thought, why don’t I design the face around the lidar? It was really liberating, to design around a brand new technology.”
A Quanergy lidar takes pride of place on the nose of the EMotion, with four other units around the vehicle. Aptiv’s test BMW has no fewer than nine lidars built into it. Toyota’s latest self-driving research vehicle, codenamed Platform 3.0, has a total of eight lidars, four of them from Luminar, a startup run by 22-year-old entrepreneur Austin Russell.
Another approach to steering lasers in solid-state lidar is to use a phased array. Strobe, a startup acquired by GM in October, probably uses one, along with an innovative frequency-modulated technology that lets the lidar measure the velocity as well as the range of other road users.
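The velocity measurement works because a moving target Doppler-shifts the returned light, and a frequency-modulated (FMCW) sensor can read that shift by beating the return against its own source. A hedged back-of-envelope sketch (the 1550 nm wavelength is a common lidar choice we use as an example, not a detail Strobe has disclosed):

```python
# Illustrative sketch: the Doppler shift an FMCW lidar sees from a
# target moving along the line of sight. Shift = 2 * v / wavelength.
WAVELENGTH = 1550e-9  # meters; example value, common in lidar

def doppler_shift_hz(radial_speed_mps: float) -> float:
    """Frequency shift of the reflected beam for a target closing at
    the given radial speed (positive = approaching)."""
    return 2.0 * radial_speed_mps / WAVELENGTH

# A car closing at 30 m/s (~108 km/h) shifts the return by tens of MHz,
# comfortably measurable, which yields velocity in a single reading.
print(f"{doppler_shift_hz(30.0) / 1e6:.1f} MHz")  # → 38.7 MHz
```

Conventional pulsed lidars must instead infer velocity by differencing positions across successive frames.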
“Phased array steering for lidar is interesting,” said Urmson. “But the challenge with that approach is the loss you get through the system. Can you really compress the sidelobes enough so that you can trust the measurements that come out? There’s a lot of interesting innovation in lidar and probably 20 to 50 startups trying different flavors.”
Some self-driving companies are already looking beyond lidar to other novel sensors. Mike Fleming was on one of the three winning teams of the DARPA Urban Challenge in 2007, and has been working since then on autonomous technology. He is now CEO of Torc, a company that has built self-driving Humvees for the U.S. military and is now turning its attention to passenger cars.
“There’s so much competition within the lidar space, we’re seeing more capabilities and the cost coming down,” he said during a test ride in Torc’s self-driving Lexus at CES. “We’ve used infrared cameras and lidars on different wavelengths with other projects, and believe that self-driving car companies will have to design their architecture to support sensors that we haven’t even thought of yet.”
Editor’s note: This story was updated on 16 January 2018 to clarify LeddarTech’s claims about its lidar system-on-a-chip.