Fender bender in Arizona illustrates Waymo's commercialization challenge | Ars Technica

Self-driving systems won't necessarily make the same mistakes as human drivers.

Timothy B. Lee - Apr 2, 2021 5:07 pm UTC

A Waymo self-driving car in Silicon Valley in 2019. Sundry Photography / Getty

A police report obtained by the Phoenix New Times this week reveals a minor Waymo-related crash that occurred last October but hadn't been publicly reported until now. Here's how the New Times describes the incident:

A white Waymo minivan was traveling westbound in the middle of three westbound lanes on Chandler Boulevard, in autonomous mode, when it unexpectedly braked for no reason. A Waymo backup driver behind the wheel at the time told Chandler police that "all of a sudden the vehicle began to stop and gave a code to the effect of 'stop recommended' and came to a sudden stop without warning." A red Chevrolet Silverado pickup behind the vehicle swerved to the right but clipped its back panel, causing minor damage. Nobody was hurt.

Overall, Waymo has a strong safety record. Waymo has racked up more than 20 million testing miles in Arizona, California, and other states.
This is far more than any human being will drive in a lifetime. Waymo's vehicles have been involved in a relatively small number of crashes, and those crashes have been overwhelmingly minor, with no fatalities and few if any serious injuries. Waymo says that a large majority of those crashes were the fault of the other driver. So it's very possible that Waymo's self-driving software is significantly safer than a human driver.

At the same time, Waymo isn't acting like a company with a multi-year head start on potentially world-changing technology. Three years ago, Waymo announced plans to buy "up to" 20,000 electric Jaguars and 62,000 Pacifica minivans for its self-driving fleet. The company hasn't recently released numbers on its fleet size, but it's safe to say that the company is nowhere near hitting those numbers. The service territory for the Waymo One taxi service in suburban Phoenix hasn't expanded much since it launched two years ago. Waymo hasn't addressed the slow pace of expansion, but incidents like last October's fender bender might help explain it.

It's hard to be sure if self-driving technology is safe

Rear-end collisions like this rarely get anyone killed, and Waymo likes to point out that Arizona law prohibits tailgating. In most rear-end crashes, the driver in the back is considered to be at fault. At the same time, it's obviously not ideal for a self-driving car to come to a sudden stop in the middle of the road.

More generally, Waymo's vehicles sometimes hesitate longer than a human would when they encounter complex situations they don't fully understand. Human drivers sometimes find this frustrating, and it occasionally leads to crashes. In January 2020, a Waymo vehicle unexpectedly stopped as it approached an intersection where the stoplight was green.
A police officer in an unmarked vehicle couldn't stop in time and hit the Waymo vehicle from behind. Again, no one was seriously injured.

It's difficult to know if this kind of thing happens more often with Waymo's vehicles than with human drivers. Minor fender benders aren't always reported to the police and may not be reflected in official crash statistics, which may overstate the safety of human drivers. By contrast, any crash involving cutting-edge self-driving technology is likely to attract public attention.

The more serious problem for Waymo is that the company can't be sure that the idiosyncrasies of its self-driving software won't contribute to a more serious crash in the future. Human drivers cause a fatality about once every 100 million miles of driving, far more miles than Waymo has tested so far. If Waymo scaled up rapidly, it would be taking a risk that an unnoticed flaw in its programming could lead to someone getting killed.

And crucially, self-driving cars are likely to make different types of mistakes than human drivers. So it's not sufficient to make a list of mistakes human drivers commonly make and verify that self-driving software avoids them. You also need to figure out whether self-driving cars will screw up in scenarios that human drivers handle easily. And there may be no way to find those scenarios other than lots and lots of testing.

Waymo has logged far more testing miles than any other company in the US, but there's every reason to think Waymo's competitors will face this same dilemma as they move toward large-scale commercial deployments. By now, a number of companies have developed self-driving cars that can handle most situations correctly most of the time. But building a car that can go millions of miles without a significant mistake is hard. And proving it is even harder.

Timothy B. Lee is a senior reporter covering tech policy, blockchain technologies, and the future of transportation. He lives in Washington, DC. Email timothy.lee@arstechnica.com // Twitter @binarybits