Tesla Alerted Driver 150 Times To Take The Wheel Before Crashing Into Cops: Report

Tesla is under a ton of investigations, mostly related to its Autopilot/Full Self-Driving Beta software. The Wall Street Journal got hold of footage and onboard computer logs from crashes involving first responder vehicles that are now under investigation. A close look at just one of those cases should give everyone in the self-driving industry pause.

Video: Tesla Autopilot Crash Analysis: Footage Suggests Reasons for Accidents With Emergency Vehicles | WSJ

The crash the Journal focused on involved a man, reportedly impaired, engaging Autopilot while driving his 2019 Model X on a freeway through Montgomery County, Texas, on February 27, 2021. The Model X hit a police car stopped in the right-hand lane with its emergency lights activated. The crash injured five officers and sent the man the police had initially pulled over to the hospital.

Those five officers are now suing Tesla, though Tesla says the responsibility for the crash lies with the allegedly impaired driver. But even accounting for an impaired driver, the facts of how the Model X behaved in this case are alarming. WSJ found the driver had to be reminded 150 times over a 34-minute period to put his hands on the wheel, with one alert coming just seconds before the crash. While the driver complied every time, he did nothing to avoid the obviously blocked lane.

Giving a driver 150 chances to behave properly and safely in the space of a little more than half an hour seems excessive, but there's another, seemingly more dangerous flaw in the Autopilot system. The 2019 Model X has both radar and cameras (Tesla removed radar from its cars a few years ago, only to double back on that decision). Radar is very good at tracking moving vehicles but less reliable with stationary objects, so the system relies on the cameras to pick up that slack. The flashing lights of emergency vehicles can confuse the cameras, experts told WSJ. In this instance, while traveling at 55 miles per hour, Autopilot recognized there was something in the lane just 2.5 seconds (roughly 200 feet) before impact. The system briefly attempted to slow down, then disengaged entirely moments before impact.

Tesla isn’t the only car company whose self-driving software has bumped up against first responder situations. Robotaxis from both Waymo and Cruise have had difficulties navigating around emergency vehicles and emergency scenes, though neither has experienced a crash, and certainly nothing this catastrophic. Those companies also operate only in certain parts of the cities where they’re deployed, like San Francisco, and at limited speeds.

Tesla is facing a laundry list of investigations from the Department of Justice, NHTSA, the California DMV, and the Securities and Exchange Commission. That’s not to mention the multiple lawsuits Tesla faces from people hurt or killed in Tesla cars or who experienced racism at Tesla factories.

You can watch the entire report at WSJ.
