A recent crash involving an Uber driverless car suggests autonomous software might take risks like a human driver.
The crash caused no major injuries. A human-driven car turning left apparently failed to give way, hit the Uber car, and flipped it onto its side. After a brief pause, Uber's self-driving test fleet was back on public roads in Tempe, Pittsburgh and San Francisco early this week. But as the police report on the crash makes clear, the events were more complicated than a simple failure to yield.

The Uber Volvo SUV, outfitted with autonomous driving sensors, was heading south on a wide boulevard with a 60 km/h speed limit. It carried two of the company's test drivers in front and no paying passengers. The traffic light turned yellow as the vehicle entered an intersection. A human-driven Honda on the other side of the road was waiting to make a left turn across traffic at the light. The driver thought the junction was clear and turned into the path of the oncoming Uber SUV, according to the police report.
In a statement to police, Uber employee Patrick Murphy, who was in the car at the time, said the Volvo SUV was travelling just below the speed limit. He said the traffic signal turned yellow as the Uber vehicle entered the intersection. He then saw the Honda turning left, but "there was no time to react as there was a blind spot" created by traffic. The Honda hit Uber's car, pushing it into a traffic pole and causing it to roll onto its side. Both a company spokeswoman and the police agree that the Uber vehicle was in autonomous mode at the time.
Self-driving cars haven't often been accused of driving aggressively; on the contrary, they are more often criticized for driving too cautiously, slowing or stopping where human drivers would be more assertive. Autonomous vehicles operated by Waymo have been rear-ended because of that kind of behaviour, and the company has been working to make its system drive more like a human. Eyewitness accounts, moreover, are often unreliable, and other witnesses in the police report did not say the Uber car was at fault, a position the police agree with. Still, Murphy's account raises the question of whether Uber's self-driving sensors spotted the light turning yellow and decided the car could safely continue through the intersection. One of Uber's self-driving SUVs ran a red light in San Francisco last year, and on five other occasions the company's mapping system for its cars failed to recognise traffic lights in the area, the New York Times reported in February.
Uber's problems illustrate the hurdles autonomous vehicles face in winning approval from the public and regulators. Makers of autonomous vehicles are working to adjust their software to handle "edge cases" such as unusual driving conditions.