
Tesla Recalls and the Dangerous Use of Autopilot Features


Tesla cars have become popular largely because of their advanced technology. Although they are not self-driving, and don't claim to be, they are probably the closest thing on the road today to self-driving cars. The problem is that their self-driving features may be too easy to engage, putting people on the roadway at risk, at least according to the government and a recent Tesla recall.

Using the Self-Driving Features

Like many cars with self-driving features, Tesla's auto-driving features are only supposed to be used on certain kinds of roads: typically highway-type roads without intersections or stop signs.

The problem, according to the recall and the government, is that although Tesla warns drivers that the feature should only be used on certain kinds of roadways, the feature can actually be turned on and used on any road.

That means that many drivers are using the auto-driving features on roads where the feature is not safe to use, such as roads with traffic lights or unclear lane markings.

Deaths Stemming From Self-Driving Autopilot

This is not a hypothetical problem; by some reports, eight deaths have resulted from the use or misuse of Tesla's auto-driving feature on inappropriate roadways. One Tesla using the autopilot feature even drove into the underside of a truck. That is in addition to reports of Teslas crashing into parked cars and other objects.

The autopilot features have also led to lawsuits by victims who say that Tesla is aware that the feature can be misused, but that Tesla has done little about it.

A Software Update

In response to government pressure, Tesla agreed to recall its cars (actually, Tesla referred to it only as a software update, not a full recall). But the software update won't completely prevent the feature from being used on roads where it shouldn't be engaged, a safeguard that many feel Tesla should institute.

The update will disengage the autopilot if it senses that the driver has not been paying attention to, or interacting with, the vehicle for an extended period of time. It will also send the driver additional warnings if the feature senses that it is being engaged on a road where it should not be used. But for now, that falls far short of the upgrades and safety features that many watchdog groups and victims would like to see.

Additional Tesla Lawsuits

These lawsuits are separate from those brought by victims who have sued Tesla alleging that the autopilot feature itself, even when properly engaged on appropriate roads, is defective. There have even been lawsuits claiming that the very use of the word "autopilot" is deceptive, leading people to believe that the car does not need the involvement of, or interaction with, a human driver.

Whether you were injured by a self-driving car or an ordinary car, we can help. Contact the Tampa personal injury lawyers at Barbas, Nunez, Sanders, Butler & Hovsepian and schedule a consultation today.

Sources:

washingtonpost.com/technology/2023/12/16/tesla-autopilot-recall/

businessinsider.com/tesla-autopilot-recall-bolsters-lawsuit-claims-the-feature-dangerous-report-2023-12

forbes.com/advisor/legal/auto-accident/tesla-autopilot-lawsuit/


© 2018 - 2024 Barbas, Nuñez, Sanders, Butler & Hovsepian, Attorneys and Counselors at Law. All rights reserved.