The revelation last month that a fatal car crash involved Tesla’s “Autopilot” feature has sparked a debate over liability in assisted driving: who is legally at fault in a crash if a car is being partly controlled by a computer?
Tesla offered a statement following the May 7 fatal crash:
The vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.
The car crashed directly into the bottom of the trailer, and the entire top of the vehicle was “torn off by the force of the collision,” according to a local newspaper. The driver was killed.
Last month, the US National Highway Traffic Safety Administration (NHTSA) announced that it would investigate the crash. The truck driver involved claimed that the Tesla driver appeared to be watching a Harry Potter movie on a portable DVD player when the collision happened. According to the police investigation, however, the DVD player found in the Tesla was not running at the time of the crash.
One issue Tesla faces is that the company calls its assisted driving feature “Autopilot” even though it doesn’t fully automate driving. The feature, which Tesla says is in beta, keeps the car within its lane, changes lanes and detects potential collisions. The NHTSA regulates these features and classifies different levels of car automation: Level 0 is your standard car, where the “driver is in complete and sole control of the primary vehicle”, while Level 4, the maximum, is described as “Full Self-Driving Automation”.
The prevailing question in crashes involving Autopilot is whether drivers are aware of the risk they are taking when they turn the feature on.
According to Gabriel Weiner, a US lawyer who maintains Stanford’s wiki on legislative and regulatory action related to autonomous driving, Tesla’s Autopilot mode fits squarely into Level 2 of the NHTSA’s classification. “When you call something ‘Autopilot’, are you suggesting to drivers this is something like a Level 4, when in reality it’s a Level 2?” Weiner said.
The problem may be that the term “autopilot” is just enough to lull drivers into the false sense that the car needs no user input and can simply drive itself. Ryan Calo, assistant professor of law at the University of Washington, said that if drivers are deemed to be aware of the risk, it may let Tesla off the hook. Even so, courts may not be forgiving: “Because we’re talking about physical safety,” Calo said, “courts and regulators will likely hold Tesla to a higher standard.”
Tesla, for its part, says that it continuously educates customers on the use of its vehicles’ features, reminding them that they’re responsible for remaining alert even when Autopilot is in use. In a blog post titled “Your Autopilot Has Arrived”, the company clearly states that the driver is “still responsible for, and ultimately in control of, the car”. The company includes similar language in the owner’s manuals of its Model S and Model X, and a prompt on the cars’ centre screen when Autopilot is engaged says: “Please keep your hands on the wheel. Be prepared to take over at any time.”
In a 2014 interview with Bloomberg, Elon Musk, the founder and CEO of Tesla, compared Autopilot to the autopilot systems used in aeroplanes: the pilot still has to monitor the vehicle and verify that it is behaving correctly.
“The onus is on the pilot to make sure they’re doing the right thing,” Musk said. “We’re not yet at a stage where you can go to sleep and wake up at your destination. We would have called it autonomous instead of autopilot if that was the case.”
When asked who would be responsible if a driver tried to use the Autopilot feature to switch lanes but instead crashed into a highway barrier, Musk said, “We’re going to be quite clear with customers that the responsibility remains with the driver.”
This differs from the stance of Google, Volvo and Mercedes on driverless cars and liability: all three companies have said they would accept responsibility for crashes caused by their self-driving systems.
Tesla’s cars, however, collect and store more data than your standard car, which may help paint a picture of exactly how certain crashes happened. “This [May 7 fatal] crash is unique in that there will be a lot more digital data, most of which Tesla hasn’t released, that could provide further clarity,” Bryant Walker Smith, an assistant professor of law at the University of South Carolina, told Gizmodo. “But even before we get to Tesla, it’s important to acknowledge that the truck driver in this crash may be at least partly at fault.”
Still, it’s likely that cases will be brought against the company by other drivers. There have been two non-fatal crashes in the US reportedly linked to Autopilot, the most recent in Montana on July 11.
The legal question of liability will also depend on where crashes happen, whether in different countries or in different states within one country. “This is very specific state by state, although Florida tends to have a broader view of liability,” Smith said.
People could also potentially bring claims against Tesla for Autopilot not performing as intended and thereby causing a crash. If someone used the automatic lane-change feature but ended up colliding with a car in the adjacent lane, Tesla could very well end up being liable.
“The plaintiff or the injured person in this type of case would need to establish Tesla’s unreasonable conduct or the product’s unreasonable performance,” Smith said.
“Tesla, in defence, would point to the driver’s behaviour. They might say it was an assumption of risk that the driver, knowing the system was imperfect, chose to use it in this way. Or they may argue that the driver himself was simply careless in using the system.”
It’s still unclear whether the fatal Florida crash will go to court, but it certainly won’t be the last of Tesla’s woes. Tesla told The New York Times that it has no plans to disable the feature, and that Autopilot is currently enabled in 70,000 cars.