Before America’s roadways become safer, they could first get more treacherous.
Vehicle automation is occurring at a breakneck pace. Already, self-parking and braking technologies in new cars are becoming commonplace. Driverless cars made by Google have already taken to the highways. Just last month, Freightliner and Daimler unveiled the first tractor-trailer approved to drive itself down Nevada freeways.
Proponents of these new technologies—including the National Highway Traffic Safety Administration—emphasize their ability to create safer roadways. After all, more than 30,000 Americans die in traffic crashes each year, and another 2.3 million were injured in 2013, the most recent year for which data are available, according to NHTSA.
“We see enormous benefits in reducing the role the human element plays in causing crashes,” said NHTSA spokesman Gordon Trowbridge.
And yet, much like the introduction of the motor car itself a century ago, the technology will present as many new challenges as it does solutions—ones that may require an overhaul of driving as we know it.
“One of the myths about automation is that as the level increases, you need less human expertise,” said Bryan Reimer, a research scientist at the Massachusetts Institute of Technology and associate director of the New England University Transportation Center.
Airplanes are one prominent example. The planes may largely fly themselves—but pilots must maintain a deep knowledge of the complex systems in the cockpit in order to intervene when necessary.
Reimer points to the “Miracle on the Hudson,” the successful 2009 landing of US Airways Flight 1549 off the coast of Manhattan by Captain Chesley Sullenberger.
“Automation at the time never would have done that,” Reimer said. “It doesn’t work well outside its programming bounds.”
Just as automation is good at performing some tasks that are difficult for humans—like holding a steady course through thick clouds—humans are good at performing some tasks that are difficult for computers. That is one of the lessons of aviation, learned over decades spent trying to remove the human element from the cockpit altogether, only to realize that human pilots are, in fact, essential.
The trick, though, is keeping pilots—whether of planes, trains, or cars—engaged, especially as their workload becomes more automated.
In fact, the Federal Aviation Administration in 2013 issued a safety alert recommending that pilots take control of their aircraft more often.
“[C]ontinuous use of autoflight systems could lead to degradation of the pilot’s ability to quickly recover the aircraft from an undesired state,” the FAA noted, adding “use of those systems does not reinforce a pilot’s knowledge and skills in manual flight operations.”
That is precisely the risk many now foresee with cars and drivers. How advanced a skill-set will drivers now need? What kinds of controls are needed to make sure they stay sharp, especially as the technology keeps advancing?
And critically, how can we make sure that drivers—who often have just fractions of a second to intervene to avoid a crash—stay engaged even as cars and trucks increasingly drive themselves?
“The problem is this ‘mushy middle’ of automation, where the human is disengaged enough to be really distracted but still has to perform some critical driving role,” said Bryant Walker Smith, an assistant professor of law and engineering at the University of South Carolina, whose research focuses on the implications of emerging transportation technology.
Manufacturers are tackling this problem in different ways.
Mercedes, for one, has steering-wheel sensors on its S-Class vehicles. “Drivers are required to touch the steering wheel to indicate they are engaged with the vehicle…about every 10 to 15 seconds,” said a spokesperson from parent company Daimler, when the car is on its version of autopilot.
Of course, these measures are as much to protect carmakers from liability in the case of an accident as they are to ensure safety. Many states and foreign countries already require drivers to keep one hand on the steering wheel at all times. And even these requirements can be easily circumvented, as one German driver demonstrated by taping a soda can to the steering wheel of his Mercedes to avoid having to keep a hand on it.
“There’s a lot of debate and discomfort over whether to and who should certify these vehicles,” said Smith. “No one has the magic answer for what allows us to say, ‘yep, this is ready’.”
And the larger question is whether society will accept the trade-off: fewer traffic accidents, injuries, and fatalities overall, in exchange for a new category of crashes caused by driver or system error in a growing fleet of increasingly autonomous vehicles.
“There are going to be some unintended consequences and crashes that would otherwise probably not have happened,” said Smith. “But the number of other crashes which could be prevented as a result, I think, would be substantially lower.”
MIT’s Reimer said that it may take an overhaul of driver licensure—uniform federal standards, for example, with different licensure classes for different types of vehicles—and perhaps more uniformity across state traffic codes as well to achieve the best outcomes with autonomous cars. “Now, you have a major state and federal rights issue, a Congressional issue,” he said.
The sooner policymakers and car manufacturers acknowledge the scope of the issue, the sooner the benefits of automation can be captured.
“This so-called ‘mushy middle’ might be the only form of automation we see in our lifetimes,” said Reimer.