In one of my previous posts, I briefly covered the classification of automation levels in driving systems. As an example, I noted that Tesla’s Autopilot is at Level 2 of automation. But why is Autopilot not at Level 3 or above, as many assume today (July 2019)?
This post is not meant to cast Autopilot in a negative light. It is meant to raise awareness of the uncertainties we might run into while using it.
As I mentioned when describing Autopilot’s features in older posts, they are still at Level 2 of automation because of the uncertainties we might come across while driving, and because Autopilot cannot alert the driver to intervene in every possible case of mishap. Though the vehicle takes care of lateral and longitudinal control on its own, environment monitoring still has to be done by the human and cannot be ignored. It is for this reason that Tesla recommends reading the vehicle manual thoroughly before using any feature. The snapshots from the manual below speak for themselves.
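The division of responsibilities that keeps Autopilot at Level 2 can be summarized in a small sketch. This is my own condensed paraphrase of the SAE J3016 levels, simplified for illustration:

```python
# A simplified view of SAE J3016 automation levels (my own paraphrase of
# the standard, condensed for illustration -- not the official wording).
SAE_LEVELS = {
    2: {"steering_and_speed": "system", "environment_monitoring": "driver",
        "fallback": "driver"},
    3: {"steering_and_speed": "system", "environment_monitoring": "system",
        "fallback": "driver (when requested)"},
    4: {"steering_and_speed": "system", "environment_monitoring": "system",
        "fallback": "system (within its operational domain)"},
}

# Autopilot automates steering and speed, but the driver must still
# monitor the environment and serve as the fallback -- hence Level 2.
print(SAE_LEVELS[2]["environment_monitoring"])  # driver
```

The key distinction is the last two rows of each entry: at Level 2, both environment monitoring and the fallback remain with the human, no matter how capable the lateral and longitudinal control is.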
But there is something unique about Autosteer!
Autosteer has an additional capability: it takes a fallback action if the driver does not take back control. When Autosteer is active, the driver has to hold the steering wheel so as to be able to take back control at any time. If hands are not detected, the driver is alerted with a warning and a chime, with the chime’s frequency increasing over time. If hands are still not detected after repeated warnings, Autosteer is disabled for the rest of that drive and the driver is expected to steer manually. If hands are not detected even after Autosteer is disabled, the vehicle slows to a stop. This sounds like a Level 4 feature, right? But it cannot take this fallback in all cases. What if a cross divider suddenly appeared in front? At best, Autopilot can alert the driver, but it cannot always fall back safely. This kind of misplaced trust has caused accidents in the past, and the March 2018 accident is a clear example, which Tesla explained as below:
“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
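The hands-on escalation described above behaves like a simple state machine. Here is a minimal sketch of that logic; the time thresholds and state names are my own assumptions for illustration, not Tesla’s actual values or implementation:

```python
# Illustrative escalation thresholds, in seconds of hands-off driving.
# These numbers are assumptions, not Tesla's real parameters.
VISUAL_WARN_AFTER_S = 15   # assumed: show a visual warning
CHIME_AFTER_S = 30         # assumed: start an audible chime
DISABLE_AFTER_S = 60       # assumed: disable Autosteer for the drive
STOP_AFTER_S = 90          # assumed: begin slowing to a stop

def autosteer_response(seconds_hands_off: float) -> str:
    """Return the escalation step for a given hands-off duration."""
    if seconds_hands_off < VISUAL_WARN_AFTER_S:
        return "autosteer active"
    if seconds_hands_off < CHIME_AFTER_S:
        return "visual warning"
    if seconds_hands_off < DISABLE_AFTER_S:
        return "audible chime (frequency increasing)"
    if seconds_hands_off < STOP_AFTER_S:
        return "autosteer disabled for rest of drive"
    return "slowing vehicle to a stop"
```

Note that every step in this chain depends on the driver eventually responding; nothing in it can steer around an unexpected obstacle on its own.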
In the snapshot of another example below, we can see the vehicle following the lane markings, unaware of the barrier ahead. You can see the new dotted lines laid down to split the current lane away from the barrier, but the vehicle kept following the previous lane markings, eventually hitting the barrier.
In another such incident, a truck was parked on the side of the road. When Cruise Control is active, it tracks the vehicle in front and maintains the user-set “follow distance”. If no vehicle is in front, the user-set cruise speed is maintained. When a stationary vehicle is parked ahead and comes into vicinity, the Tesla does not initially know it is stationary and tries to maintain the follow-distance setting. By the time it confirms the object is a stationary obstacle, it is too late, and the car hits the truck. The Tesla manual clearly calls out this scenario, just like the other warnings.
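A back-of-the-envelope calculation shows why late classification of a stationary obstacle is so dangerous. Every number here is an illustrative assumption, not a measured Tesla parameter:

```python
# Sketch of the stationary-obstacle problem described above.
# Speeds, ranges, delays, and deceleration are assumed values.

def stopping_distance(speed_mps: float, decel_mps2: float) -> float:
    """Distance needed to brake from speed to zero: v^2 / (2a)."""
    return speed_mps ** 2 / (2 * decel_mps2)

def collision_margin(speed_mps: float, detect_range_m: float,
                     classify_delay_s: float, decel_mps2: float = 6.0) -> float:
    """Gap remaining (metres) once the car stops; negative means impact."""
    travelled_while_deciding = speed_mps * classify_delay_s
    return (detect_range_m - travelled_while_deciding
            - stopping_distance(speed_mps, decel_mps2))

# At highway speed (~30 m/s), with an assumed 100 m detection range and
# a 2 s delay before the object is confirmed stationary:
margin = collision_margin(speed_mps=30, detect_range_m=100, classify_delay_s=2.0)
print(margin)  # -35.0 -> the car cannot stop in time
```

Under these assumed numbers, the classification delay alone consumes 60 m of the 100 m gap, leaving less room than the 75 m the car needs to brake; hence the negative margin.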
All the shortcomings quoted above apply to other semi-autonomous driving systems as well; it is not just Tesla that faces these situations. In short, Autopilot expects an attentive driver, just like the other systems.
Tesla cars built after October 2016 already have the advanced hardware needed to support Autopilot features, which get better day by day. Autopilot may reach Level 3 in the near future, as Tesla’s fleet learning is actively improving from its vehicles’ data. Tesla’s planned Robotaxi service by 2020 requires features at Level 4 automation or above, but Tesla’s features are still at Level 2. Whatever it takes, have a deep look at the manual before your belief is put at stake.
No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide each year. If the current safety level of a Tesla vehicle were applied across the board, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be ten times that of non-autonomous cars.