Transportation

Consumer Group Worries Over Safety of Google’s Self-Driving Cars

Even though Google's self-driving technology was not at fault in any of 11 minor accidents involving the cars over six years of testing, Consumer Watchdog has raised an alarm over the vehicles' safety, even releasing a video that simulates a serious crash in a tech-failure scenario. The group frequently accuses Google of privacy violations and exerting undue influence on the U.S. government.

Media pressure this week led Google to reveal that its self-driving cars, which are being tested on select city streets in California, have been involved in 11 accidents. All were minor accidents that occurred over the past six years, according to Chris Urmson, director of the self-driving car program.

The disclosure followed an Associated Press report that Google vehicles were involved in three collisions since September, when reporting all accidents involving self-driving cars became mandatory.

Ten of the accidents occurred when other drivers hit the Google cars, Urmson said. Seven were rear-enders, two were side-swipes, and one was caused by someone rolling through a stop sign.

The one accident caused by a Google car occurred when an engineer was driving the vehicle manually and rear-ended another vehicle.

Self-driving cars “don’t overdrive their capability and can’t get distracted, so it’s nearly impossible for them to cause an accident,” said Rob Enderle, principal analyst at the Enderle Group.

“If they see the car in front behaving badly, they would likely slow down, pulling to the side of the road while sending out a law enforcement alert — possibly attaching a film of the other car’s behavior,” he told TechNewsWorld. “They would also alert other self-driving cars in the area of the hazard.”

Time to Show and Tell

Consumer Watchdog earlier this month demanded that Google release full details of an accident involving one of its self-driving cars, and called on it to commit to making all future accident reports public.

Consumer Watchdog on Tuesday held a press event on problems with self-driving cars.

Those problems, according to the group, include bad weather interfering with the vehicles’ sensors; the vehicles’ inability to recognize hand signals; and an inability to recognize road conditions such as large potholes, open manholes or newly installed traffic lights.

Is It Safe?

Google’s 23 self-driving vehicles have driven 1.7 million miles — nearly 1 million of those miles autonomously — and average about 10,000 self-driven miles a week, mostly on city streets, Urmson pointed out.

That’s not as good as it sounds, contended John Simpson, director of Consumer Watchdog’s Privacy Project. Eleven accidents in 1.7 million miles works out to 0.65 accidents per 100,000 miles, about twice the rate of 0.3 property-damage accidents per 100,000 miles driven in 2013 reported by the U.S. National Highway Traffic Safety Administration.
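Simpson’s comparison is simple arithmetic; a quick back-of-envelope check (using only the figures cited in the article) confirms the numbers:

```python
# Figures from the article: 11 accidents over 1.7 million miles driven,
# compared against NHTSA's 2013 baseline of 0.3 property-damage
# accidents per 100,000 miles.
accidents = 11
miles_driven = 1_700_000
nhtsa_rate = 0.3  # property-damage accidents per 100,000 miles (2013)

# Normalize the Google fleet's count to the same per-100,000-mile basis.
google_rate = accidents / miles_driven * 100_000

print(f"Google fleet rate: {google_rate:.2f} accidents per 100,000 miles")
print(f"Ratio vs. NHTSA baseline: {google_rate / nhtsa_rate:.1f}x")
# Prints roughly 0.65 and 2.2x, matching the "about twice" claim.
```

Note that this comparison counts accidents the Google cars did not cause, which is one reason the raw ratio is contested.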

Google has come under fire for not being more forthcoming with details, but “like every other company testing autonomous vehicles on California roads, [Google] shares information on accidents with the DMV as required by regulations,” company spokesperson Katelin Jabbari told TechNewsWorld.

Six other companies are also testing self-driving cars, and a total of 48 autonomous vehicles are licensed for testing in California, AP reported.

“Google has an opportunity to set an overall industry standard of behavior for data sharing,” said Roger Lanctot, associate director, global automotive practice, at Strategy Analytics.

“Just as Android is open, the Google car ought to be open,” he told TechNewsWorld. “Otherwise we will not know to what extent we are making, or have made, any progress.”

Making the Roads Safe

The best way to prevent accidents is for automakers to make all cars connected, and “we are working on that,” Lanctot said. Existing wireless technologies can link up vehicles just fine — creating, in effect, an Internet of Things for vehicles.

Nonetheless, there will always be unforeseen circumstances, such as pedestrians dashing across streets, careless drivers making sudden turns, or an oncoming vehicle flying over the road divider, and it is difficult, if not impossible, to program for all of them.

“We don’t teach the car to drive by creating a checklist of 1 million different scenarios it needs to handle and then check each item off,” Google’s Jabbari said. “Rather, you teach the car by giving it fundamental capabilities to respond correctly to different categories of scenarios as they emerge.”

Richard Adhikari

Richard Adhikari has written about high-tech for leading industry publications since the 1990s and wonders where it's all leading to. Will implanted RFID chips in humans be the Mark of the Beast? Will nanotech solve our coming food crisis? Does Sturgeon's Law still hold true? You can connect with Richard on Google+.
