Rules for self-driving cars are stuck in park

Federal regulations for automation are voluntary

Automakers are racing to develop driverless cars, putting increasingly complex technology on the road despite concerns from safety experts and the National Transportation Safety Board about a lack of regulations.

Unlike rules for the design of a seatbelt or airbag, the federal guidelines for automated vehicle systems are voluntary. The U.S. Department of Transportation says keeping rules to a minimum will speed up the introduction of life-saving technology, a goal made all the more urgent as traffic deaths climbed again last year to 37,461, with an estimated 94 percent of crashes caused by human error.

That lack of mandatory rules for self-driving cars has given automakers and technology companies the green light to police themselves, said Jackie Gillan, president of Advocates for Highway and Auto Safety. The group is calling for the government to issue mandatory safety standards for driverless cars.

“Before we introduce this technology we need to have some assurance and accountability by the industry that this technology is not going to kill or injure consumers,” Gillan said.

The National Transportation Safety Board makes recommendations after investigating major transportation incidents. The board recently called on DOT to issue new safety rules after its investigation of a May 2016 fatal crash of a Tesla Model S operating on Autopilot near Williston, Fla. The Tesla slammed into a tractor-trailer after its cameras and automatic emergency braking system failed to spot the blank side of the truck against the white sky.

It was the first known deadly wreck of a car driving with that level of automated sophistication. The NTSB said the driver relied too heavily on the car’s traffic-aware cruise control and autosteering features, but it also faulted Tesla’s Autopilot for allowing the driver to go for prolonged periods without interacting with the car.

Investigators found the driver had his hands on the wheel for only 25 seconds of the 37 minutes the car was on Autopilot. After the crash, Tesla updated its software to require drivers to touch the wheel periodically while the car is driving itself, to ensure a human is paying attention to the road. Now a driver who repeatedly fails to touch the wheel will “strike out,” causing the car to slow to a stop in its lane with its hazard lights on and disabling Autopilot for the remainder of the trip.
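How such a safeguard might work is easier to see sketched out in code. The following is a minimal illustration, not Tesla’s actual implementation: the strike threshold, check interval, and all names are assumptions made up for this example.

# Illustrative sketch of a "strike out" attention safeguard.
# All thresholds and names are hypothetical, not Tesla's code.
MAX_STRIKES = 3          # assumed number of ignored warnings allowed
CHECK_INTERVAL_S = 30    # assumed seconds between hands-on-wheel checks

class AttentionMonitor:
    def __init__(self):
        self.strikes = 0
        self.locked_out = False  # True once automation is disabled for the trip

    def check_hands_on_wheel(self, hands_detected):
        """Run once per check interval while the automation is engaged."""
        if self.locked_out:
            return "automation disabled for remainder of trip"
        if hands_detected:
            self.strikes = 0     # an attentive driver resets the count
            return "ok"
        self.strikes += 1
        if self.strikes >= MAX_STRIKES:
            self.locked_out = True
            # Per the article: slow to a stop in lane, hazard lights on.
            return "strike out: slowing to a stop, hazards on"
        return "warning %d of %d" % (self.strikes, MAX_STRIKES)

# A driver who ignores three consecutive checks strikes out:
monitor = AttentionMonitor()
for hands in (False, False, False):
    print(monitor.check_hands_on_wheel(hands))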

NTSB calls for companies to share information

For the NTSB, the crash investigation was an opportunity to look at safeguards for vehicle automation. The NTSB found there is no requirement or system in place for Tesla to share lessons learned from the crash with other manufacturers of autopilot systems. The board called on the transportation department to develop a way for companies to easily report information about crashes and incidents involving automated systems, like how the FAA requires airlines to share information about airplane defects.

“We don’t think each manufacturer of these vehicles needs to learn these lessons independently,” said Robert Molloy at the NTSB’s Office of Highway Safety.

The NTSB said federal regulators should at least require that cars with automation include safeguards to keep drivers from relying on the technology beyond what it is designed to do.

“System safeguards that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways were lacking,” NTSB Chairman Robert Sumwalt III said.

The Department of Transportation says it is preparing a response to the National Transportation Safety Board’s call for new safety guidance for automated vehicle systems. In September, the department scaled back some of its original recommendations for highly automated vehicles in a push to encourage quicker development of the technology. The guidelines remain voluntary.

“With rapidly evolving technology, voluntary guidance is the right tool for the right time in that it prioritizes safety while encouraging innovation and discouraging a messy patchwork of state regulations,” a DOT spokesman said in a statement.

Lawmakers try to figure out regulations

Congress is now considering legislation that would require the transportation department to regulate driverless car technology, but there are sticking points. Lawmakers have yet to agree on how to write safety rules for automated vehicles without tapping the brakes on a technology widely expected to be as much of a safety game-changer as the seatbelt.

“How do you not slow down innovation?” said Alan Morrison, associate dean for Public Interest and Service Law at George Washington University Law School. “The trick is to get the right regulations and do it quickly.”

In September, the House of Representatives passed the SELF DRIVE Act. The Senate is considering similar legislation known as the AV START Act. Both bills would require the Department of Transportation to develop new mandatory rules for automated vehicles. The legislation also would require manufacturers to inform drivers about the capabilities and limits of the technology and to certify the safety of their automated systems.

Gillan says the bills still fail to require strong enough safety rules.

“Unfortunately, these bills put too much trust in the hands of the automakers and not enough protections of the public,” she said.

It’s unclear if either bill has enough support to make it to the president’s desk. Lawmakers disagree on how much the federal government should be able to limit state laws for autonomous cars to avoid a patchwork of regulations. And there’s debate about how many vehicles should be able to get exemptions from any new federal rules to allow road testing of technology.

In the meantime, DOT’s National Highway Traffic Safety Administration still has the power to order automakers to recall any car with defective technology, a point former President Barack Obama made when unveiling the first set of autonomous car guidelines, which were also voluntary.

“Make no mistake: If a self-driving car isn’t safe, we have the authority to pull it off the road. We won’t hesitate to protect the American public’s safety,” Obama wrote in an op-ed.

But without new regulations, Gillan says automakers might bungle their chance to build trust in driverless technology.

“If you’re introducing anything to the marketplace you need to know that there are minimum requirements that everyone is going to meet before they make the public the guinea pigs of their experiments,” she said.

Jeff Turner, who drives a Tesla Model X in suburban Washington, D.C., and trusts the Autopilot system, believes automakers won’t risk their reputations on unproven safety systems.

“Beyond their reputation, their customers are at stake and their brand for the car is at stake,” he said. Roads would be safer with more automated systems, even if they’re not yet regulated, he said. “Some of the time when you get a little fatigued after you’ve driven a number of hours it’s still on it, it’s still paying attention 100 percent.”