
Tesla's Autopilot feature resulted in a 40% drop in crashes

For those who fear the emerging automotive technology, there's reassuring news about Tesla's self-driving "Autopilot" feature. An investigation into a crash in May of last year not only cleared the Autopilot function as the cause of the crash, but also found that crash rates have dropped by about 40% since the feature was introduced. Let's explore the technology's evolution and the safety concerns that surround it.

What is Tesla's Autopilot?

First of all, Autopilot doesn't refer to a fully driverless car. It's actually a driver-assistance feature that supplements drivers, but doesn't quite replace them entirely...yet.

Tesla's goals for the technology

Tesla believes it already has the hardware for fully self-driving vehicles and that what remains is a software challenge. As such, it is pursuing advancements aggressively, with significant updates every few months. According to Electrek, Tesla Motors CEO Elon Musk says that by the end of 2017, he feels "pretty good about the goal of a demonstration drive of full autonomy all the way from LA to New York. Basically from a home in LA to – let's say – dropping you off in Time Square in New York and then having the car park itself". You wouldn't even need to charge the vehicle yourself in the scenario he is envisioning.

Levels of self-driving ability

There are many steps in between a normal car and a fully self-driving one. In terms of hardware, software and functionality, a lot has to be innovated upon. The industry recognizes six different levels of self-driving ability, from Level Zero to Level Five. Here's what they all mean:

  • Level Zero is a normal car, where the driver has full control and the car can't do anything independently of the driver.
  • Level One is when a car has just one piece of self-driving technology, for example, an emergency braking system like many cars today do.
  • Level Two is when a car has two autonomous functions but a driver still must remain ready to take control, for example, having both cruise control and lane-centering technology.
  • Level Three is when drivers can give the car control of safety-critical functions while driving, but only in certain situations, and the driver must be present though they don't have to pay as much attention as in Level Two. It is a complicated grey area for lawmakers.
  • Level Four is when the car can drive itself in almost all scenarios, except for extreme weather or difficult roads. The driver doesn't have to pay any attention when it is switched on.
  • Level Five is when the driver only has to turn the car on and set a destination. The closest thing to this at the moment is Google's self-driving car project, Waymo, which has no steering wheel or pedals. This is what Tesla is currently aiming for by the end of 2017, as we mentioned before.
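The six levels above form a simple ordered classification, so they can be sketched in code. Here's a minimal illustration (the enum names and the `driver_must_monitor` helper are our own labels, not anything from Tesla or a standards body) showing how the key practical difference, whether the human must keep watching the road, falls out of the level:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Driving automation levels as described in the article (illustrative names)."""
    NO_AUTOMATION = 0      # normal car: driver has full control
    DRIVER_ASSISTANCE = 1  # one assist feature, e.g. emergency braking
    PARTIAL = 2            # two assists (e.g. cruise control + lane centering); driver stays ready
    CONDITIONAL = 3        # car handles safety-critical tasks, but only in certain situations
    HIGH = 4               # drives itself except in extreme weather or difficult roads
    FULL = 5               # driver only sets a destination

def driver_must_monitor(level: AutonomyLevel) -> bool:
    """At Levels 0-2 the human must continuously watch the road; from Level 3 up,
    attention demands drop off and vanish entirely by Level 4."""
    return level <= AutonomyLevel.PARTIAL

print(driver_must_monitor(AutonomyLevel.PARTIAL))  # True
print(driver_must_monitor(AutonomyLevel.HIGH))     # False
```

The fatal crash discussed below happened with a Level Two-style system: by this classification, the driver was still required to monitor the road.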

Safety and driver responsibility

A deadly crash ruled not Autopilot's fault

In May of last year, a Model S hit a tractor trailer in Florida while the Autopilot was on. The wreck killed the driver, and an investigation by the NHTSA (National Highway Traffic Safety Administration) was launched. That investigation is now over, and the results have been made public.

The NHTSA has concluded that there were no defects in the Autopilot or the automatic emergency braking, so the crash was not the fault of the Tesla technology, as the system is intended as "an Advanced Driver Assistance System (ADAS) that requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes." In this case, the driver failed to take control of the vehicle to prevent the crash according to both the NHTSA and Tesla.

Currently, whenever the Autopilot system is turned on, it advises drivers to "be prepared to take over at any time" and keep their hands on the wheel. The system will prompt the driver to place their hands on the wheel when required for safety, and it will disable the Autopilot feature with its "strike out" system if the driver fails to do so. In the fatal crash last May, according to Tesla, neither "Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

While the beginning of this case hurt Tesla's brand and maybe confirmed some people's doubts about the technology, it turns out it wasn't the fault of Autopilot after all. And in the end, it revealed that Autopilot may even make cars safer.

Autopilot feature resulted in a 40% drop in crashes

That crash was just one particular case. To evaluate the relative safety of the Autopilot technology overall, we must look at a wider sample of data from the time before and after it was implemented. For the model years 2014-2016, Tesla Model S and X vehicles had 40% fewer crashes once autosteering was introduced.
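To see where a figure like 40% comes from, here is the arithmetic, using the crash rates cited in the NHTSA report: roughly 1.3 airbag-deployment crashes per million miles before Autosteer, and roughly 0.8 after. (Treat the exact rates as illustrative; the takeaway is how a rate drop translates into a percentage.)

```python
# Crash rates per million miles driven, per the NHTSA report (approximate)
rate_before_autosteer = 1.3
rate_after_autosteer = 0.8

# Relative reduction: (old - new) / old
reduction = (rate_before_autosteer - rate_after_autosteer) / rate_before_autosteer
print(f"{reduction:.0%}")  # → 38%, commonly rounded to "about 40%"
```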

As more software and hardware is developed to make cars self-driving, by Tesla and other companies, we will keep a close eye on the safety of this emerging technology.

We're one step closer to driverless cars. What do you think of Tesla's goal? Are you still unconvinced of the safety of the idea? Let us know in the comments below.

1 Comment

  • Sorry, this is just a cop out. Autopilot failed to see a huge tractor trailer blocking the road in broad daylight and it's NOT an autopilot failure??? What is autopilot supposed to do if it can't detect a huge object on the road? Next they will tell us Tesla went off the bridge on autopilot, but it's not autopilot failure. BTW what does Tesla have to do with Android? Is Androidpit becoming a Tesla fanboy?