How to Make Self-Driving Cars Safer on Roads with Humans

October 18, 2018

As the National Highway Traffic Safety Administration and Congress press to remove safety regulations in order to fast-track the introduction of highly automated vehicles, human factors/ergonomics experts are recommending that rigorous testing and driver support be required.

The Human Factors and Ergonomics Society (HFES) says autonomous vehicles can and must be made safer. But the group says a serious flaw in current autonomous software, which ignores how humans interact with technology, must first be fixed.

HFES members, who include psychologists and other scientists, designers, and engineers, say humans often perform more poorly when using automated systems. People are also poor at monitoring automation, which leads to more distracted driving. They warn that instead of making drivers safer, autonomous vehicles can actually decrease safety by lowering drivers’ understanding of what is happening, even when they are paying attention. As a result, drivers are unable to take control when the car cannot handle the situation.

For example, the National Transportation Safety Board found that the cause of the fatal Tesla crash in Florida in 2016 was over-reliance on automation, lack of engagement by the driver, and inattention to the roadway. Another Tesla crashed into a road barrier in California while on Autopilot, killing its driver. Uber also experienced a fatal accident in 2018 when one of its vehicles struck and killed a pedestrian in Arizona.

According to the 4,500-member HFES, these are not isolated incidents but rather “symptoms of a serious flaw in the design of autonomy software, which ignores how people’s performance is affected by technology.”

The HFES issued a policy statement with recommendations to guide regulators when considering the introduction of autonomous vehicles. The recommendations fall in four categories:

  • Automated vehicles require careful testing before deployment on public roads.
  • Automated vehicles should support the needs of human drivers and other users.
  • Automated vehicles should be safe and understandable.
  • Automated vehicles should be accompanied by detailed training for drivers.

Before Congress removes safety regulations and allows companies to sell autonomous cars, HFES urges lawmakers to include these requirements to protect the safety of drivers, passengers, and pedestrians.

The HFES guidance recommends that testing should ensure that new technology results in safer driving across a wide range of driving conditions. “If the software is not fully reliable and the human driver has to step in, new safety regulations are needed to provide understandable displays that keep the driver aware of what is happening and ensure that they can rapidly take control,” the organization says.

Also, according to the HFES statement, detailed training by manufacturers will be needed to help people better understand how the automation functions and what its limitations are. “This training will need to be ongoing as software updates over time are made to autonomous vehicles while they sit in drivers’ garages,” the statement adds.

