
Bloomberg UK – Tesla Failed to Stop Autopilot Misuse, Safety Expert Testifies
See original article by Madlin Mekelburg at Bloomberg UK
Tesla Inc. hasn’t done enough to protect against drivers misusing its Autopilot system, a safety expert testified at a trial over a 2019 fatal collision.
Mary “Missy” Cummings, an engineering professor at George Mason University, told jurors in Miami federal court that the Tesla owner’s manual, which contains critical warnings about how the system works, is difficult for drivers to access.
She also said that prior to the crash, the company was having problems with drivers ignoring computer-generated warnings and had not embraced so-called geo-fencing already in use by other car makers to block drivers from activating driver-assistance functions on roads they’re not designed for.
The lawsuit was brought by the estate of Naibel Benavides Leon, who was killed, and by Dillon Angulo, who was seriously injured, when a Tesla Model S went through a T-intersection in Key Largo and off the pavement, striking their parked Chevrolet Tahoe as they were standing next to it.
The plaintiffs’ lawyers allege that Tesla’s driver-assistance system was defective and that the company failed to warn users about its limitations. Tesla maintains that the crash was caused by driver error, a defense the company has successfully used to win two previous California trials when Autopilot was blamed for accidents.
George McGee, the driver of the Model S, had engaged the driver-assistance system, but had dropped his mobile phone and wasn’t watching the road while reaching for the device on the floorboard.
Lawyers for Angulo and the estate of Benavides Leon told the jury that the collision was a “preventable tragedy” and that the automated system built into the car failed to respond when it detected the end of the roadway, regardless of how McGee was driving.
They have repeatedly shown jurors augmented video clips captured by cameras on the car that show the system identifying the edge of the road, paint on the roadway indicating a stop sign, the Tahoe parked off road and a pedestrian standing nearby.
But Tesla argues that no technology that was on the market in 2019 would have been able to prevent the crash, and that McGee was fully at fault because he was pressing the accelerator and overriding the vehicle’s adaptive cruise control before he went off the road.
Cummings was asked by plaintiffs’ attorney Brett Schreiber about a letter to the National Highway Traffic Safety Administration (NHTSA) in which Tesla asserted that “Autopilot has the most robust set of warnings against driver misuses and abuse of any feature ever deployed in the automotive industry.”
She told the jury, “I saw no evidence that would back up this claim that they have the most robust set of warnings.”
When Cummings was appointed as a safety adviser for NHTSA in 2021, Tesla Chief Executive Officer Elon Musk called her “extremely biased against Tesla,” and Tesla fans signed a petition opposing her appointment.
Cummings has served as an expert witness in at least two other lawsuits against Tesla related to the Autopilot system, according to court filings.
The professor said McGee made clear in statements after the accident that he regarded the car as his copilot and believed it would stop for obstacles in the road. Like many Tesla drivers, she said, McGee felt he could rely on Autopilot to keep driving when he dropped his phone.
“The car is doing a good job of driving so I’m going to reach down and pick it up because my copilot is driving,” she said.
The case is Benavides v. Tesla, 1:21-cv-21940, US District Court, Southern District of Florida (Miami).