Uber's homebrew automation software blamed in deadly crash

by Nick Aziz
Investigators have blamed the crash that killed a pedestrian in Arizona on Uber's seemingly shoddy autonomous driving software.
In any artificial intelligence system, classification is the crucial first step in deciding how to react to data coming from cameras, sensors, and other inputs. A first-generation semi-autonomous Tesla Autopilot system infamously failed to correctly classify a white semi trailer against a bright sky in the accident that killed Joshua Brown, but the sensors and computational power available to that system -- as well as its stated purpose and intended use -- were entirely different from those of the Uber vehicle.
While Uber's rationale is sound -- having two automation systems both attempting to intervene could be problematic, to say the least -- the company chose not to use the Volvo system even as an indirect input or as an alarm for the safety driver. The NTSB says the Volvo system could have alerted the driver, but it was effectively left shouting into the wind.
That decision is one of many that have led observers to question the diligence and thoroughness of Uber's approach to safety. Some have also questioned the company's decision to run single-occupant test vehicles, whereas most AV companies in the early stages of testing employ two technicians per car.
In a desperate race to catch up to Waymo -- Google's self-driving car division, now on the cusp of commercializing self-driving taxis -- Uber has found no shortage of controversy, including clashes with regulators in California, a not-at-fault collision that was nonetheless potentially avoidable, a videotaped incident of one of its cars blatantly failing to detect a red light, and most recently the deadly crash that led to a nationwide halt to testing.
What's more, statistics released by authorities in California indicate that in 2017, Uber prototypes managed to drive only 13 miles between "disengagements", which are either computer- or driver-initiated hand-offs from autonomous mode to manual control. By contrast, Waymo managed to average 5,700 miles between disengagements in 2017, the majority of which were driver-triggered out of an abundance of caution. In the final quarter of 2017, Waymo reported approximately 1 disengagement per 30,000 miles.
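The per-company figures above come from dividing total autonomous miles driven by the number of disengagements reported over the same period. A minimal sketch of that calculation is below; note that the mileage and disengagement totals used here are illustrative placeholders chosen to reproduce the quoted averages, not the actual values from the 2017 California DMV filings.

```python
# Sketch: deriving miles-per-disengagement averages of the kind quoted
# above. California DMV annual filings list each company's total
# autonomous miles and total disengagements; the averages are the ratio.
# The totals below are illustrative placeholders, not real report values.

def miles_per_disengagement(total_miles: float, disengagements: int) -> float:
    """Average autonomous miles driven between hand-offs to manual control."""
    if disengagements == 0:
        return float("inf")  # no hand-offs recorded over the period
    return total_miles / disengagements

# Placeholder totals chosen to match the averages quoted in the article:
uber_avg = miles_per_disengagement(total_miles=26_000, disengagements=2_000)
waymo_avg = miles_per_disengagement(total_miles=570_000, disengagements=100)

print(f"Uber:  {uber_avg:.0f} miles per disengagement")   # prints 13
print(f"Waymo: {waymo_avg:.0f} miles per disengagement")  # prints 5700
```

The striking gap between the two ratios is why disengagement rates, despite their limitations as a metric, are so often cited as a rough proxy for autonomous-driving maturity.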