A 2022 Tesla Model 3 crash in Utah that killed motorcyclist Landon Embry is raising fresh safety concerns about the company’s Autopilot system. Two years after the accident, Embry’s family is suing Tesla, claiming Autopilot failed to detect and avoid the motorcycle.
The lawsuit, filed in state court in Salt Lake City last week, argues that the vehicle’s advanced sensors and cameras should have detected the motorcycle and prevented the fatal crash. The impact threw Embry from his bike, and he died at the scene. The case underscores the risks of relying on advanced technology like Autopilot in real-world driving.
The lawsuit also states that the Tesla Model 3’s driver was too tired to drive safely, and that this impaired state contributed to the crash.
“A reasonably prudent driver, or adequate auto braking system, would have, and could have slowed or stopped without colliding with the motorcycle,” the complaint said. Tesla did not respond to a request for comment.
Multiple Incidents Involving Tesla’s Autopilot
This is not the first time that Tesla’s Autopilot and Full Self-Driving (FSD) systems have been involved in an accident.
In April 2024, another fatal accident occurred in Seattle involving a Tesla Model S. The car was operating in Full Self-Driving mode when it struck and killed a 28-year-old motorcyclist. Like the Utah crash, this incident has heightened scrutiny of the safety and effectiveness of Tesla’s driver assistance technologies.
Moreover, data from the National Highway Traffic Safety Administration shows that advanced driver assistance systems (ADAS) have been involved in a growing number of crashes. In 2022, nearly 400 crashes in the US involved these technologies, resulting in six deaths and five serious injuries. Tesla’s Autopilot and Full Self-Driving featured in 273 of those crashes, including five fatal ones.
Adding to the pressure on Tesla, the company recently settled a lawsuit related to a 2018 crash involving a Model X. In that case, the car, operating on Autopilot, veered off a highway and killed an Apple engineer. The settlement highlights ongoing concerns about the effectiveness of Tesla’s safety features and the company’s accountability when things go wrong.
Final Thoughts
As Tesla faces more lawsuits, a big question arises: Are these advanced driver assistance systems really safe for everyday use? While companies like Tesla claim that these technologies make driving safer and more efficient, their real-world performance continues to raise concerns. Critics argue that over-reliance on these systems can create a false sense of security, since the technology remains imperfect.
The case of Landon Embry, like other similar incidents, shows the need for more rigorous testing and improved safety standards for autonomous driving technologies. As these technologies advance, manufacturers and regulators must thoroughly test them and ensure their reliability before they are widely deployed on public roads.