Tesla Reportedly Alerted Driver 150 Times to Take the Wheel Before Crashing Into Cops

Tesla is facing a raft of investigations, most of them related to its Autopilot and Full Self-Driving Beta software. The Wall Street Journal obtained footage and onboard computer logs from crashes involving first responder vehicles that are under investigation. A close look at just one of those cases should give everyone in the self-driving industry pause.

The crash the Journal focused on involved a man, reportedly impaired, who engaged Autopilot while driving his 2019 Model X on a freeway through Montgomery County, Texas, on February 27, 2021. The Model X hit a police car that was stopped in the right-hand lane with its emergency lights activated. The crash injured five officers and sent the man the police had initially pulled over to the hospital.

Those five officers are now suing Tesla, though Tesla says responsibility for the crash lies with the allegedly impaired driver. But even accounting for an impaired driver, the facts of how the Model X behaved in this case are alarming. The WSJ found the driver had to be reminded 150 times over a 34-minute period to put his hands on the wheel, with one alert coming just seconds before the crash. While the driver complied every time, he did nothing to avoid the obviously blocked lane.

Giving a driver 150 chances to behave properly and safely in a little more than half an hour seems excessive, but there’s another, seemingly more dangerous flaw in the Autopilot system. The 2019 Model X has both radar and cameras (Tesla removed radar from its cars a few years ago, only to walk back that decision). Radar is very good at tracking moving vehicles but less adept at spotting stationary ones, so the system relies on the cameras to pick up that slack. The flashing lights of emergency vehicles can confuse the cameras, experts told the WSJ. In this instance, Autopilot recognized there was something in the lane 2.5 seconds before impact while travelling at 88 km/h. The system briefly attempted to slow down, then disengaged entirely moments before impact.

Tesla isn’t the only car company whose self-driving software has bumped up against first responder situations. Robotaxis from both Waymo and Cruise have had difficulties navigating around emergency vehicles and emergency scenes, though neither has experienced a crash like this, and certainly nothing so catastrophic. Those robotaxis are also restricted to certain parts of the cities they operate in, like San Francisco, and are capped at lower speeds.

Tesla is facing a laundry list of investigations from the Department of Justice, NHTSA, the California DMV, and the Securities and Exchange Commission. That’s not to mention the multiple lawsuits Tesla faces from people hurt or killed in its cars, or who experienced racism in its factories.

