Tesla argues Autopilot warned driver before fatal crash
A trial in Miami federal court is examining a 2019 collision involving a Tesla Model S and a parked SUV in Key Largo, Florida.
The crash resulted in the death of a woman standing next to the SUV and serious injuries to her boyfriend.
Tesla argues that its Autopilot system functioned as intended and blames the driver, George McGee, who allegedly ran a stop sign while distracted by reaching for a dropped cell phone.
The lawsuit, filed by the family of the deceased woman and by her injured boyfriend, puts Tesla’s semi-autonomous driving technology under scrutiny.
Evidence presented in court includes data recovered from the Model S, which indicates McGee was traveling 17 mph over the speed limit and began braking just 0.55 seconds before impact.
Tesla’s lawyer highlighted that McGee had safely navigated the same intersection nearly 50 times before the crash. He attributed the incident to a change in McGee’s behavior at the time.
The trial is expected to last three weeks. The outcome could potentially affect Tesla’s strategy regarding self-driving technology.
🔗 Source: Bloomberg
The Miami trial is not the first high-profile case examining Tesla’s Autopilot system in a fatal crash; it is part of a documented pattern of incidents with similar characteristics.
In 2018, a Tesla Model X on Autopilot crashed into a concrete barrier in California, killing the driver, whose hands had been off the wheel for the six seconds before impact despite multiple warnings [1].
A year later, in 2019, the NTSB documented another fatal crash in which a Tesla Model 3 using Autopilot collided with a semi-trailer crossing a highway; the driver’s hands were off the wheel for the final eight seconds [2].
These incidents reflect a consistent challenge: Autopilot systems can fail to detect certain obstacles (barriers, crossing vehicles) while drivers, despite warnings, remain disengaged from the driving task.
From 2021 to 2024, Tesla vehicles accounted for 53.9% of all reported autonomous vehicle incidents, suggesting these safety concerns extend beyond individual cases [3].
The recurring pattern raises questions about whether Tesla’s driver monitoring systems are sufficient to prevent misuse, even when functioning as designed.
The critical 1.65-second window before the Florida crash highlights what safety experts call the “handoff problem”: the dangerous transition period when technology disengages and human drivers must suddenly take control.
Research shows human drivers typically need 3 to 7 seconds to regain situational awareness after disengaging from automated tasks, making the sub-2-second warning in this case potentially insufficient for effective intervention [4].
Multiple Tesla crashes follow a similar pattern: drivers become inattentive (looking at phones, hands off wheel) while Autopilot is engaged, then fail to respond quickly enough when the system abruptly requires human intervention.
The Tesla trial highlights a critical question for semi-autonomous technology: whether systems that require constant human readiness to intervene can ever overcome the fundamental human tendency toward distraction and complacency when monitoring automated processes.