On the afternoon of May 11, a Tesla Model S rear-ended a fire truck stopped at a traffic light in South Jordan, Utah. According to the local police, the 28-year-old driver admitted she was speeding at the time of the collision and, more disturbingly, she was checking messages on her phone.
A simple case of driver distraction, right? Not so fast. The driver had engaged Tesla’s Autopilot mode just prior to the incident, trusting the car’s onboard tech to take care of all the driving.
More and more drivers are putting similarly misplaced trust in “semi-autonomous” driving systems like Autopilot, and sometimes the mistake costs lives.
The Human Problem
There is no such thing as a “semi-autonomous” car.
While today’s vehicles can handle some driving tasks—like staying within their lane, keeping in step with traffic, and avoiding conflict with other motorists on the road—they are not designed to replace the driver completely. They need us to be ready to take the wheel at a moment’s notice if the car’s autopilot system fails.
“That is the price of using these systems,” says Robert Molloy, director of the Office of Highway Safety at the National Transportation Safety Board (NTSB). But the more cars can do by themselves, Molloy says, the more people trust them and the more confident they become in the computer’s capabilities.
“The better systems like Tesla’s Autopilot and GM’s Super Cruise get at controlling your vehicle, the faster they lull you into a false sense of security. It’s human nature.”
The result is scores of drivers whose attention wanders from the road as they check email, send texts, or even grab some shut-eye. Despite some ambitious claims about what driver-assist technology can do, car companies are well aware of its limitations and the fact that drivers need to stay ready at the wheel. Autopilot and its competitors incorporate visual or audio warnings to scold drivers who take their hands off the wheel. So far, though, that hasn’t been enough.
Telemetry data from the Model S crash in Utah demonstrates this fact. According to an email statement from Tesla supplied to the South Jordan Police, the driver repeatedly engaged and disengaged the car’s Autosteer and Traffic-Aware Cruise Control while traveling around the suburbs of Salt Lake City, as if testing the limits of the system, and took her hands off the wheel for long periods while Autopilot was engaged.
“On two such occasions, she had her hands off the wheel for more than a minute each time, and her hands only came back on after a visual alert was provided,” Tesla said.
Each time, the driver put her hands back on the wheel just long enough to satisfy Autopilot’s driver status monitor. About 1 minute and 22 seconds before hitting the truck, the driver enabled Autopilot and, within seconds, took her hands off the steering wheel. She did not touch the wheel again or take any action to avoid the collision, witnesses say.
Luckily, the driver only broke an ankle and no one else was hurt. Others who have misused Tesla’s Autopilot have not been so lucky. In March, a man was killed when his Tesla Model X struck a concrete median in Mountain View, California. Data shows that he also took his hands off the wheel for long periods of time and ignored repeated warnings from the car to resume control of the vehicle.
New Tech, Old Laws
Video captured in April 2018 shows a driver sitting in the passenger seat while his Tesla drives itself down England’s M1 motorway.
States have spent years enacting laws against distracted driving, but driver-assist technology is surfacing a new problem: people misusing the technology because they trust it to do too much. Some law enforcement experts think this means we need new laws.
In the Utah case, the driver received a traffic citation for “failure to keep a proper lookout,” a rule designed to make sure drivers stay engaged with their surroundings. The case is still pending; if convicted at trial, she faces a maximum penalty of a $2,500 fine, 12 months in jail, and a six-month loss of license.
However, that law wasn’t written with partially automated systems in mind. The European Union, by contrast, has enacted specific laws for driver-assist technology that require drivers to remain in control of their vehicles at all times and keep their hands on the steering wheel. “They cannot remove their hands from the wheel for longer than 20 seconds,” Molloy says.
In late April, a British motorist filmed himself sitting in the passenger seat of his Tesla Model S while Autopilot chauffeured him down the M1 motorway at speed. The driver pleaded guilty to “dangerous driving.” According to reports, he was banned from driving for 18 months and will be required to pay a £1,800 fine, carry out 10 days of rehabilitation, and perform 100 hours of community service.
However, this errant driver got caught only after the video he posted to social media went viral, prompting law enforcement to track him down and charge him. For this reason, Bobbie Seppelt, a research scientist at the MIT AgeLab and chair of autonomous driving for the Society of Automotive Engineers, doesn’t believe new regulations will have much effect.
“Enforcing laws prohibiting a person from misusing driver-assists would be difficult in the same way it has been difficult to regulate cell phone use,” Seppelt says. “Often, that misuse is on a second-by-second basis, which is hard to catch.”
Seppelt also believes laws limiting the use of these features could curtail their adoption, and thus put a damper on their potential safety benefits, which she feels far outweigh their faults.
Where Do We Go From Here?
The situation isn’t hopeless. But to keep highways safe, drivers need to better understand driver-assist technology, and carmakers need to improve transparency as human mobility evolves.
“First, we have to better educate the consumer of what the technology is capable of doing,” says Deborah Hersman, CEO of the National Safety Council. “And do it in terminology people can reference and understand without the marketing hype.”
Second, automakers need to design products that encourage responsible use. “If your hands are off-wheel too long, you get a warning,” says MIT’s Seppelt. “The next time you engage the system, the amount of time you spend off-wheel before receiving a warning is shortened and gets shorter and shorter every time you abuse the feature until you’re locked out of the system completely. These technologies are a privilege, not a right.”
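The escalation scheme Seppelt describes is easy to picture in code. Below is a minimal sketch of that logic in Python; the class name, thresholds, and decay rate are all illustrative assumptions, not any automaker’s actual implementation.

```python
# Minimal sketch of the escalating hands-off-wheel warning scheme
# Seppelt describes. All names, thresholds, and decay rates here are
# illustrative assumptions, not any automaker's real implementation.

class HandsOffMonitor:
    def __init__(self, initial_grace_s=60.0, decay_factor=0.5, lockout_after=3):
        self.grace_s = initial_grace_s    # allowed hands-off time before a warning
        self.decay_factor = decay_factor  # how much the grace period shrinks per abuse
        self.lockout_after = lockout_after
        self.warnings = 0
        self.locked_out = False

    def check(self, hands_off_seconds):
        """Call periodically with the current continuous hands-off time."""
        if self.locked_out:
            return "locked_out"
        if hands_off_seconds < self.grace_s:
            return "ok"
        # Driver exceeded the grace period: warn and tighten the limit.
        self.warnings += 1
        self.grace_s *= self.decay_factor
        if self.warnings >= self.lockout_after:
            self.locked_out = True  # system refuses to engage again
            return "locked_out"
        return "warn"


monitor = HandsOffMonitor()
print(monitor.check(45))  # "ok": within the 60 s grace period
print(monitor.check(61))  # "warn": grace period drops to 30 s
print(monitor.check(35))  # "warn": grace period drops to 15 s
print(monitor.check(20))  # "locked_out": third strike, feature disabled
```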
Third, and most importantly, we should make advanced driver-monitoring systems mandatory for all cars. Safety advocates believe it’s the only way to stop people from abusing these technologies.
“As we monitor the technology, the technology should be watching us,” says Seppelt. “Systems that follow a driver’s eye and body movements, determining where he or she is looking and what his or her head and body movements suggest about focus and state of mind and body.”
Some experts want to go even a step further, saying that systems should monitor a driver’s vital signs: pulse rate, temperature, respiration rate, and blood pressure. These readings would indicate the state of the driver’s essential body functions in case of a medical emergency.
“Such a safety backup will ensure, as people get comfortable using driving systems, or if they are otherwise indisposed [i.e., drunk, drowsy, or having a medical emergency], that a human or robot is managing the full driving task,” says Seppelt.
Indeed, Volvo and General Motors already employ eye-tracking in their partially automated driving systems to ensure drivers are paying attention while the technology is engaged.
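A driver-monitoring check of this kind can be as simple as tracking what fraction of recent gaze samples fall on the road. The sketch below is purely illustrative, with made-up thresholds and window sizes; it does not reflect Volvo’s or GM’s actual systems.

```python
# Purely illustrative sketch of a gaze-based driver-attention check.
# Thresholds and window sizes are made-up assumptions; real systems
# from Volvo, GM, and others are far more sophisticated.

from collections import deque

class AttentionMonitor:
    def __init__(self, window=50, min_on_road_fraction=0.7):
        self.samples = deque(maxlen=window)  # rolling window of gaze samples
        self.min_on_road_fraction = min_on_road_fraction

    def add_gaze_sample(self, eyes_on_road: bool):
        self.samples.append(eyes_on_road)

    def driver_attentive(self) -> bool:
        if not self.samples:
            return False  # no data yet: assume inattentive and warn
        on_road = sum(self.samples) / len(self.samples)
        return on_road >= self.min_on_road_fraction


monitor = AttentionMonitor()
for _ in range(40):
    monitor.add_gaze_sample(True)   # driver watching the road
for _ in range(10):
    monitor.add_gaze_sample(False)  # glances at a phone
print(monitor.driver_attentive())   # True: 80% of samples are on-road
```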
But no matter how much tech you pack inside of a car, humans will remain the ultimate x-factor.
“Look at drunk driving,” says Hersman. “We’ve been battling it for decades. Yet people still drive drunk every day.”
Until humans can catch up with the technology powering their vehicles, accidents will happen.