Google's futuristic self-driving cars needed old-fashioned human intervention to avoid 11 crashes during testing on California roads, the company revealed Tuesday. Google called the results encouraging, though they show the technology has yet to reach its goal of eliminating the need for someone behind the wheel. With Google's fleet logging tens of thousands of miles each month, the 11 instances work out to roughly one such event every three years for a vehicle driven as much as the U.S. average.
There were another 272 cases over roughly a year of testing in which failures of the cars' software or onboard sensors forced the backup driver, who must sit in the front seat just in case, to grab the wheel. Though Google did not release detailed scenarios, the problems included the self-driving cars failing to detect traffic lights, not yielding to pedestrians and committing traffic violations. Intervention was also needed when other drivers behaved recklessly, and there were several dozen instances of an "unwanted maneuver" by Google's car.
"There's none where it was like, 'Holy cow, we just avoided a big wreck,'" said Chris Urmson, who heads Google's self-driving car project. During this phase of testing, Google cars usually stay below 35 mph, although they also drive on highways.
"We're seeing lots of improvement. But it's not quite ready yet," Urmson said. "That's exactly why we test our vehicles with a steering wheel and pedals."
Bryant Walker Smith, a professor at the University of South Carolina who closely follows self-driving car developments, said the rate of potential collisions was "not terribly high, but certainly not trivial." He said it remains difficult to compare Google's cars with human drivers, since even the best accident data on human drivers omit minor collisions that are never reported to authorities.
While the problem rate is "impressively low," a trained safety driver should remain in the front seat, said Raj Rajkumar, an engineering professor at Carnegie Mellon University who specializes in self-driving cars. According to data in Google's report, a driver typically took control within one second of the car asking for help.
John Simpson, a frequent critic of Google who focuses on privacy issues for the nonprofit group Consumer Watchdog, said the report "underscores the need for a driver behind the steering wheel capable of taking control of the robot car."
Google has argued to California regulators that once the company concludes the cars are ready for the public to use, they should not need a steering wheel or pedals because human intervention would actually make them less safe. "It's unfathomable that Google is pushing back" on the need for a wheel and pedals, Simpson said.
Seven companies with permission to test self-driving prototypes on California roads were required to report to the state Department of Motor Vehicles any instance in which a driver had to take over because of technology problems or safety concerns.
Google released its report before the agency or the other companies did, in what it described as an effort to be transparent about its safety record. The company had lobbied against having to report "disengagements" from self-driving mode, saying the data could be misinterpreted.

According to computer modeling the company performed later, Google's cars would have been responsible in eight of the 11 avoided accidents. In two other cases, its cars would have hit a traffic cone. Google cars have been involved in nine collisions since September 2014; in each case, the other car was responsible, according to an analysis by researchers at Virginia Tech.