Data logs suggest the driver repeatedly took her hands off the wheel and only put them back on for a few seconds at a time to silence the warnings.
The Tesla Model S driver injured in the recent Autopilot crash in Utah apparently ignored repeated warnings to keep her hands on the wheel, according to a police review of data logs.
The car was traveling around 60 mph when it slammed into the back of a fire truck stopped at a red light. The driver reportedly admitted to looking down at her phone before the collision, which left her with a broken foot.
A chronology of data records suggests she took her hands off the wheel in more than a dozen instances during the drive cycle. On two occasions, her hands were off the wheel for more than a minute and only returned to the wheel when the car illuminated a visual warning.
"Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds," the report says.
About a minute and 22 seconds before the crash, she re-enabled Autosteer and, within two seconds, took her hands off the steering wheel again.
"She did not touch the steering wheel for the next 80 seconds until the crash happened," the report says.
Noting that drivers are repeatedly advised that Autopilot does not make vehicles "autonomous" and are instructed to keep their hands on the wheel and eyes on the road at all times, the police department issued the Model S driver a citation for failure to keep proper lookout.
Some safety advocates have claimed Tesla should do more to clarify that Autopilot is not fully autonomous and modify the system to better monitor driver engagement.
"What's actually amazing about this accident is that a Model S hit a fire truck at 60mph and the driver only broke an ankle. An impact at that speed usually results in severe injury or death."
— Elon Musk (@elonmusk), May 14, 2018
Tesla chief Elon Musk recently voiced frustration with negative media coverage, arguing that statistics are unequivocal that Autopilot improves safety and drivers aren't being misled into thinking the system is capable of fully autonomous operation.
"It is the opposite case," he said in a recent conference call. "When there is a serious accident, almost always, in fact maybe always, the case is that it is an experienced user, and the issue is more one of complacency ... It's thinking they know more about Autopilot than they do."