Tesla Owners Use Autopilot Where It’s Not Designed To Work

According to Tesla, the driver-assist feature that it calls Autopilot is “intended for use on controlled-access highways” that have “a center divider, clear lane markings, and no cross traffic.” The owner’s manual also points out that it may not work well in areas with hills or sharp turns. Still, despite having the ability to limit where drivers can use Autopilot, Tesla chooses not to. As The Washington Post found, that decision has led to at least eight serious, and in some cases deadly, crashes.

Take, for example, victims Dillon Angulo and Naibel Benavides Leon:

After a long day of fishing in Key Largo, Fla., Dillon Angulo and Naibel Benavides Leon pulled to the side of the road and hopped out of their Chevy Tahoe to look at the stars. Suddenly, Angulo said, the “whole world just fell down.”

A Tesla driving on Autopilot crashed through a T intersection at about 70 mph and flung the young couple into the air, killing Benavides Leon and gravely injuring Angulo. In police body-camera footage obtained by The Washington Post, the shaken driver says he was “driving on cruise” and took his eyes off the road when he dropped his phone.

But the 2019 crash reveals a problem deeper than driver inattention. It occurred on a rural road where Tesla’s Autopilot technology was not designed to be used. Dash-cam footage captured by the Tesla and obtained exclusively by The Post shows the car blowing through a stop sign, a blinking light and five yellow signs warning that the road ends and drivers must turn left or right.

Following the 2016 crash that killed Joshua Brown, the National Transportation Safety Board has encouraged the National Highway Traffic Safety Administration to put limits on where driver-assistance features can be used. So far, though, NHTSA hasn’t, arguing that ensuring driver-assistance systems are used only under the conditions they were designed for would be too complicated and resource-intensive. Instead, NHTSA prefers to focus on making drivers pay attention while using systems such as Autopilot.

Tesla, meanwhile, has said since 2018 that it sees no reason to limit where owners can use Autopilot because “the driver determines the acceptable operating environment.” Drivers don’t have perfect judgment, though, and the Post identified about 40 serious or deadly crashes since 2016 in which a Tesla driver was using Autopilot at the time of the crash. At least eight of those crashes took place on roads where the driver shouldn’t have been using Autopilot.

“If the manufacturer isn’t going to take safety seriously, it is up to the federal government to make sure that they are standing up for others to ensure safety,” NTSB chair Jennifer Homendy told the Post. But “safety does not seem to be the priority when it comes to Tesla.” She then took aim at the NHTSA, asking, “How many more people have to die before you take action as an agency?”

There’s a lot more to the article, so head on over to the Post and give the entire thing a read.