“New York would need 30 percent fewer vehicles if the taxi fleet, even with human drivers, is managed better,” Carlo Ratti, the director of MIT's Senseable City Lab, tells IEEE Spectrum. That’s a big savings, both in taxis and in the space they take up on city streets. New York’s 14,000-odd taxis log some 500,000 trips a day.
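The intuition behind Ratti's estimate is that many trips can be chained onto the same vehicle: if one ride ends early enough for the taxi to reach the next pickup in time, one car can serve both. Here is a minimal, hedged sketch of that idea; the toy trip list and the 10-minute repositioning time are illustrative assumptions, not data from the MIT study.

```python
# Greedy trip-chaining: an upper bound on the fleet needed to serve a
# schedule of trips. Two trips share a taxi if the first ends early
# enough for the car to reposition to the second pickup in time.
from typing import List, Tuple

REPOSITION_MIN = 10  # assumed dropoff-to-next-pickup travel time (minutes)

def min_fleet(trips: List[Tuple[int, int]]) -> int:
    """trips: (start_minute, end_minute) pairs; returns vehicles needed."""
    vehicles_free_at = []  # end time of each vehicle's last assigned trip
    for start, end in sorted(trips):
        # Reuse any vehicle whose last trip ended in time to reposition.
        for i, free_at in enumerate(vehicles_free_at):
            if free_at + REPOSITION_MIN <= start:
                vehicles_free_at[i] = end
                break
        else:
            vehicles_free_at.append(end)  # no free vehicle: add one
    return len(vehicles_free_at)

# Toy day: six trips that would naively need six one-trip taxis.
trips = [(0, 20), (5, 25), (35, 50), (40, 60), (70, 85), (75, 95)]
print(min_fleet(trips))  # prints 2: chaining cuts the fleet to two cars
```

The real study solves this at city scale with a matching algorithm over millions of trips, but the greedy version above already shows why a well-managed fleet needs far fewer cars than one-trip-per-vehicle dispatching.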
The technology would seem to help the beleaguered taxi business fend off private ride-hailing services like Uber and Lyft, which have their own algorithms, optimized partly to match drivers with passengers and partly to pool ride-sharing customers.
Waymo CEO John Krafcik announced last week that the company would be launching a driverless taxi service in Phoenix later this year. An application Waymo filed with the California Department of Motor Vehicles (DMV) for driverless testing, obtained by IEEE Spectrum using public record laws, reveals more about how that service might work.
Waymo is already operating a fully driverless pilot test in Arizona, where companies do not have to seek permission for self-driving cars, with or without human safety operators, or report on their progress. It’s a different matter in California, where many self-driving companies are based. In April, the state’s DMV started accepting applications for fully driverless testing. So far, the DMV has received two applications—one from Waymo, an Alphabet company, and the other from U.S./China startup JingChi.ai.
Self-driving vehicle startup Drive.ai has only been around since 2015, but it has moved aggressively toward getting autonomous cars out into the world to do useful things safely and efficiently. Drive struck a partnership with Lyft last September to test its technology with the ride-sharing service in San Francisco, and this week, the company announced an on-demand self-driving car service in Frisco, Texas, just north of Dallas.
Starting in July, 10,000 local residents in a small area consisting mostly of office parks and restaurants will gradually receive access to Drive.ai's app, which they'll be able to use to hail an autonomous car to drive them around, free of charge. The service will run on a few fixed routes connecting specific locations. If everything goes well after six months, the company will add more routes in other areas.
Drive.ai is not the first self-driving car company to run a pilot, but there are a few things that make its effort particularly interesting. First is the introduction of hardware and software to allow autonomous vehicles to communicate with pedestrians and other drivers, a desperately needed capability that we haven't seen before. And second, Drive will implement a safety system with remote "tele-choice operators" to add a touch of humanity to tricky situations—and keep humans in the loop even after safety drivers are eventually removed from the vehicles.
We spent a very hot Monday in Frisco at Drive.ai's launch event, took a demo ride, and talked with as many engineers as we could to bring you the details on all the new stuff that Drive has been working on.
One of the truisms of the self-driving car business is that you can’t begin to function properly without super-detailed, constantly updated digital maps that show buildings, trees, and other features.
That might seem no problem at all if you’re a Google spinoff called Waymo. After all, your corporate parent possesses vast mapping capabilities, and besides, you’re driving in your home turf—Mountain View, or maybe Phoenix. But how can even mighty Google map every last country lane, then freshen up the data every month or two, so a car won’t be surprised to find that a freshly planted cornfield is now knee-high?
Israel-based Innoviz has announced that it will supply solid-state lidar to BMW. The device, along with radar and other systems, will be incorporated into a self-driving package from Magna, a major auto supplier.
Innoviz says that when volume production begins, the lidar’s price should drop to the hundreds of dollars, down from the “single-digit thousands” that today’s test units go for. The company says that it can now make several thousand units a month on its existing assembly line, in Israel, and that it’s building another line in China.
The company argues that today’s deal with BMW vindicates the solid-state approach to lidar, in which the laser beam is steered without machinery. Innoviz does the trick with microscopic, movable mirrors. Most recent lidar startups also use solid-state approaches.
Automatic emergency braking that can help cars avoid hitting pedestrians could become standard in many cars in the coming years. But a new study suggests such safety systems will need sensor coverage spanning almost 180 degrees in front of the car to avoid colliding with faster-moving cyclists.
In the coming months, an unnamed manufacturer will bring an electric car to market that offers wireless charging from WiTricity, Alex Gruzen, the company’s chief executive, tells IEEE Spectrum.
Unnamed, yes, but not utterly unguessable. Among the companies that have demonstrated wireless charging are BMW and Hyundai. And, though there are other wireless charging companies out there—Qualcomm, for example—Hyundai has explicitly named WiTricity as the supplier of the system it showed on its new Kona EV last week at the International Geneva Motor Show. Other companies known to be working with WiTricity include Honda, Nissan, and Toyota.
Luminar, a lidar startup founded by a 16-year-old, has come of age. Founder Austin Russell is now 23, and he tells IEEE Spectrum that his company has started mass production.
“This year we will produce 5,000 units per quarter, enough to equip every autonomous car unit on the road,” he said, during a visit to our offices in Manhattan on Monday. “We had been using optics PhDs to hand-assemble them; by year’s end one’ll be coming off the line every 8 minutes.”
That’s an achievement, and Russell has evidently kept the production milestone under his hat for a while. After all, we’re 12 days into the second quarter. Russell is good at staying under the radar, having kept Luminar in stealth mode for its first five years. And, at 6 feet, 4 inches, Russell has to work at being inconspicuous.
Lidar startups have come up like mushrooms—Russell says he’s tracking about 60 of them—but up to now none could match the production numbers of the industry’s pioneer, Velodyne, whose roof-mounted rotating beacon has become the identifying mark of the self-driving car. Many of those startups don’t even have a working prototype to show off.
Six months ago Toyota announced that its experimental self-driving car was using Luminar's lidar. "Three other major auto makers have committed to using our platform for all their development," Russell says. "These fleets will ultimately evolve into serial production models for the market."
Luminar’s sensor can see 250 meters down the road, 50 meters more than the prototype could manage a year ago. That would give a self-driving car plenty of time to react to events, even at highway speeds.
The Uber taxi that killed a pedestrian last month was apparently doing 60 kilometers per hour (38 mph). At that speed, Russell says, his company’s lidar can give about 14 seconds’ warning. To be sure, it’s not yet clear whether the Uber accident had anything to do with the lidar’s capabilities.
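The warning-time figure is easy to sanity-check with basic kinematics. Dividing the 250-meter detection range by a 60 km/h closing speed gives about 15 seconds of travel time, consistent with Russell's roughly 14 seconds once a moment of detection and processing latency is allowed for. (The numbers below are simple arithmetic, not Luminar's internal figures.)

```python
# Back-of-the-envelope check: how long does a 250 m detection range
# buy a car closing on a stationary obstacle at 60 km/h?
RANGE_M = 250.0
SPEED_KMH = 60.0

speed_ms = SPEED_KMH * 1000 / 3600      # 60 km/h ≈ 16.7 m/s
warning_s = RANGE_M / speed_ms          # travel time to cover 250 m
print(round(warning_s, 1))              # prints 15.0
```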
When I last met up with Russell, a year ago, he had one of the handful of prototypes his company had made under his arm. It was a rather substantial machine, but today’s production version is substantially smaller and lighter.
It works by splitting a laser beam into two parts, one for each of two windows placed side by side at an angle. Each covers a 60-degree field, for a total of 120 degrees; to give a car full 360-degree coverage with a certain amount of redundancy, you’d probably want four units. Microscopic moving mirrors steer the beam through the field of coverage. Paired with the laser is a photodetector that picks up the reflected light.
The default setting is to focus on the horizon, at 10 frames per second. It can scan as slowly as 1 frame per second for “absurdly high resolution,” Russell says, or go up to 20 Hertz, for equally absurd quickness, at the cost of resolution. The tradeoff can be specified by the car’s software.
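The tradeoff Russell describes follows directly from a fixed measurement budget: if the scanner produces a constant number of laser returns per second, the points available per frame scale inversely with frame rate. A short sketch makes this concrete; the one-million-points-per-second budget is an illustrative assumption, not a published Luminar spec.

```python
# Fixed scan budget: points per frame = points per second / frame rate.
POINTS_PER_SECOND = 1_000_000  # assumed constant measurement budget

def points_per_frame(frame_rate_hz: float) -> int:
    return int(POINTS_PER_SECOND / frame_rate_hz)

for hz in (1, 10, 20):
    print(hz, "Hz ->", points_per_frame(hz), "points/frame")
# 1 Hz gives 20x the per-frame resolution of 20 Hz, at 1/20 the update rate.
```

This is why the choice can be left to the car's software: a parked car mapping an intersection might want 1 Hz detail, while the same unit at highway speed wants 20 Hz updates.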
But the real trick is simple, brute-force laser power. Luminar can shine its single pencil of light brighter than others do because it works at a wavelength of 1550 nanometers, which can barely penetrate to the retina. Our own Evan Ackerman took a deep dive into the optics back in July.
The wavelength can be handled only by the compound semiconductor indium gallium arsenide, a costly material. Up until now, only the military could afford to work with it. But Luminar’s chief talking point has been that the lidar feature that car makers most need to optimize is performance, not price. That’s also been the implicit position of industry pioneer Velodyne, which has charged upwards of US $70,000 for its most capable models. Tellingly, Velodyne has not yet specified the price it’ll ask for its newest and most capable lidar, the VLS-128.
“A 3-inch array of InGaAs used to cost $30,000,” Russell says. “But we use very little—about the width of a human hair. It costs us $3 to build the entire receiver module. Of course, what it costs us to make is not the same as what we charge for it!”
Information continues to emerge about the automated vehicle in Uber’s fleet that fatally struck a pedestrian the night of 19 March 2018 in Tempe, Ariz. While Tempe police chief Sylvia Moir cautiously speculated that neither a human nor an automated driver could have avoided the crash based on video of the incident, she refused to rule out charges for the backup driver.
This was a bit puzzling—how can a driver be at fault in an unavoidable crash? Then, Tempe police released video of the crash on Wednesday night.
In the video, taken from the car’s forward-looking dash cam, the pedestrian seems to appear from nowhere wearing all black, on a dark street, outside of a crosswalk—a seemingly impossible situation for a driver. Meanwhile, the camera recording the inside of the vehicle showed the test driver looking down at something near the console for more than 5 continuous seconds, immediately before impact. Traveling at 40 miles per hour, the vehicle would have covered the length of a football field in that time.
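The football-field comparison checks out with straightforward unit conversion: 40 mph held for five seconds covers about 89 meters, close to the 91.4-meter (100-yard) playing field. The figures below are arithmetic only, not investigation data.

```python
# Distance covered at 40 mph over 5 seconds of inattention.
SPEED_MPH = 40.0
SECONDS = 5.0

speed_ms = SPEED_MPH * 1609.344 / 3600  # 40 mph ≈ 17.9 m/s
distance_m = speed_ms * SECONDS
print(round(distance_m, 1))             # prints 89.4 (vs. 91.4 m field)
```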
This is a guest post. The views expressed in this article are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.
Uber’s accident earlier this week—the first fatality involving a pedestrian and an autonomous car—already has fingers pointing to various parties, though it’s still too early to tell who’s responsible. The plot thickens every day with new rumors and information.
A woman, jaywalking at night. Walking a bicycle. Possibly homeless. On a busy street known for partying. In Arizona, with permissive laws. Involving Uber, with permissive ethics. Which had hired an ex-felon to be the safety driver in the autonomous vehicle (AV).
As we wait for a full investigation, we can start untangling the strands of responsibility, which include the following possibilities.