Q: My question is inspired by a near-miss I had on the freeway with a Tesla driver who barged in front of me on the way to the exit, on a cell phone with no hands on the wheel. Why doesn’t the law prohibit driver assistance technology, from the misnamed “autopilot” all the way down to cruise control, in congested conditions?
A: The problem with advanced driver assistance systems (ADAS) is right there in the name: the driver. And in the case of Tesla, the branding. How do you get away with naming your product “Autopilot” or “Full Self-Driving” when, by your own description, it is neither an autopilot nor fully self-driving? Tesla’s driver assistance system is better than many, but despite repeated promises from Elon Musk, the cars still don’t drive themselves.
That’s the hitch with ADAS. While you’re using it, you feel like the car has everything under control. But the system can hand control back at any moment, often because the situation is too complex for it to handle. When it does, the driver has to be ready, and not all of them are. Some Tesla drivers, believing the car has more ability than it does, have decided to watch a movie or take a nap, sometimes with disastrous results.
Before you put too much trust in ADAS, take a look at the reviews of the top-performing systems. Even the best of them will let you down. They’ll veer out of their lane, fail to see a pedestrian crossing the street, and crash into stopped vehicles.
Despite the imperfections, ADAS reduces crashes. All those things I just mentioned? Humans do them too, and at a higher rate. It’s tough to figure out exactly how much higher, though. There isn’t a national database that makes a fair comparison of crashes in ADAS and non-ADAS cars. The National Highway Traffic Safety Administration has begun tracking ADAS crashes, but the information is self-reported by car companies. Tesla appears to perform the worst in this data, but that’s probably more a reflection of the company’s comprehensive data collection than of a comparative lack of safety.
Tesla reports that drivers using Autopilot travel over five million miles per crash, compared to a US average of one crash per 670,000 miles, but Tesla’s data sources make this more of an apples-to-baseballs comparison: Autopilot miles are mostly freeway miles, where crash rates per mile are already much lower, and the two figures don’t count crashes the same way. Independent research shows ADAS-equipped vehicles reduce injury crashes by 27 percent and property damage crashes by 19 percent.
Drivers who treat ADAS as it’s intended, as assistance, get a safety benefit from the system; drivers who use it while riding in the back seat (it’s happened) put themselves and others at risk. But banning driver assistance features in congested conditions because some drivers misuse them would take away safety tools where drivers benefit from them most. For example, front-to-rear crashes, common in heavy traffic, are cut in half by forward collision avoidance systems.
Some of the problem may be that as ADAS gets better, drivers become complacent about the technology, putting too much trust in the system. Currently there is no requirement that car dealers train buyers on ADAS, or even explain it to them. Perhaps the solution isn’t to prohibit the use of these systems, but to build training into ADAS-equipped cars so that drivers can’t turn the system on until they know what it can, and more importantly, can’t do.
No matter what features a car has (or what Tesla names them), you presently can’t buy a full self-driving car. It’s up to us to drive responsibly, however much help the car is giving us.