TechNewsWorld.com

Feds Put AI in the Driver's Seat

By Richard Adhikari
Feb 11, 2016 10:19 AM PT

The artificial intelligence component of Google's Level 4 autonomous cars can be considered the driver, whether or not the cars are occupied by humans, the U.S. National Highway Traffic Safety Administration said in a letter released Tuesday.

Level 4 full self-driving automation vehicles perform all safety-critical driving functions and monitor roadway conditions for an entire trip.

Google's L4 vehicle design will do away with the steering wheel and the brake and gas pedals.

Current U.S. Federal Motor Vehicle Safety Standards, or FMVSS, don't apply because they were drafted when driver controls and interfaces were the norm and it was assumed the driver would be a human, the NHTSA wrote to Chris Urmson, who heads Google's Self-Driving Car Project.

Those assumptions won't hold as autonomous car technology advances, and the NHTSA may not be able to use its current test procedures to determine compliance with the safety standards.

Google is "the only company so far committed to L4 because its objective is to completely eliminate the human, and thus human error, from driving," said Praveen Chandrasekar, a research manager at Frost & Sullivan.

"Ford and GM are thinking about similar levels" of automation, he told TechNewsWorld.

Safety Standards

Google provided two suggested interpretations of what constitutes a driver and one for the driver's seating position, then applied those approaches to various provisions so that its self-driving vehicle design could be certified as compliant with FMVSS.

"The next question is whether and how Google could certify that the SDS (self-driving system) meets a standard developed and designed to apply to a vehicle with a human driver," the NHTSA wrote. It must have a test procedure or other means of verifying such compliance.

The NHTSA's interpretation "is significant, but the burden remains on self-driving car manufacturers to prove that their vehicles meet rigorous federal safety standards," U.S. Transportation Secretary Anthony Foxx said Wednesday.

The NHTSA's interpretation is "outrageous," said John Simpson, a consumer advocate at Consumer Watchdog.

"Google's own numbers reveal its autonomous technology failed 341 times over 15 months, demonstrating that we need a human driver behind the wheel who can take control. You'll recall that the robot technology failed 272 times, and the human driver was scared enough to take control 69 times," he told TechNewsWorld.

Unresolved Issues

Many of Google's requests "present policy issues beyond the scope and limitations of interpretations and thus will need to be addressed using other regulatory tools or approaches," the NHTSA stated.

They include FMVSS No. 135, which governs light vehicle brake systems; FMVSS No. 101, which covers controls and displays; and FMVSS No. 108, governing lamps, reflective devices and associated equipment.

In some cases, Google might be able to show that certain standards are unnecessary for a particular vehicle design, but it hasn't yet made such a showing.

Google may have to seek exemptions to prove its vehicles meet FMVSS standards as an interim step because the NHTSA's interpretations don't fully resolve all the issues raised.

"All kinds of people are working on L4 cars, and there's an indication the NHTSA's going to be relatively accommodating with the granting of exemptions," said Roger Lanctot, an associate research director at Strategy Analytics.

"The orientation of NHTSA is strongly toward taking the driver out of the driver seat," he told TechNewsWorld.

Insurance and Liability

Current FMVSS rules will have to change to accommodate Google's request, and that will drive changes in auto insurance, Frost & Sullivan's Chandrasekar predicted, because "currently, insurance is decided based largely on the driver and minimally on the vehicle."

Further, liability "is a huge factor, and that will need to be carefully analyzed as OEMs will end up being largely responsible," he said. "This is why OEMs like Volvo, Audi and Mercedes-Benz have stated that in their L3 vehicles they'll assume all liability when the vehicle is driving itself."


Richard Adhikari has written about high-tech for leading industry publications since the 1990s and wonders where it's all leading to. Will implanted RFID chips in humans be the Mark of the Beast? Will nanotech solve our coming food crisis? Does Sturgeon's Law still hold true? You can connect with Richard on Google+.

