Transportation

Google Car Stubs Toe

One of Google’s self-driving cars kissed a bus on Valentine’s Day, marking the first accident in which one of the company’s autonomous vehicles was at least partly at fault.

Possibly too smart for its own good, the self-driving car was attempting to reenter its previous lane when it made contact with a municipal bus.

The car had pulled over in preparation for a right turn, but came to a stop when it detected sandbags near a storm drain at the intersection, according to Google.

The car pulled to the side before making the right-hand turn because that’s what a human driver would do.

Google was testing a software update that would allow its autonomous vehicles to make such a maneuver, giving trailing vehicles the space and opportunity to pass a car that is preparing to turn.

The car navigated back into the roadway under the assumption that the bus, traveling in the same direction, had enough time to yield to it and would do so. However, that wasn’t the case.

Traveling at under 2 mph, the Google car hit the passing bus, which was traveling at 15 mph. The California DMV estimated that the bus driver had about 3 seconds to respond to the car’s reentry attempt.
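
To put those numbers in perspective, here is a rough back-of-the-envelope calculation in Python. It uses only the speeds and the reaction window reported above; the distances it prints are illustrative approximations, not figures from the DMV report.

    # Rough figures based only on the speeds reported above; these are
    # illustrative approximations, not values from the DMV report.
    MPH_TO_FPS = 5280 / 3600              # 1 mph is about 1.47 ft/s

    bus_speed = 15 * MPH_TO_FPS           # ~22 ft/s
    car_speed = 2 * MPH_TO_FPS            # ~2.9 ft/s, an upper bound
    reaction_window = 3                   # seconds, per the DMV estimate

    # Distance the bus covered while its driver had time to react,
    # and how far the gap to the reentering car closed in that time.
    bus_distance = bus_speed * reaction_window                     # ~66 ft
    closing_distance = (bus_speed - car_speed) * reaction_window   # ~57 ft

    print(f"Bus traveled about {bus_distance:.0f} ft in {reaction_window} s")
    print(f"The gap closed by about {closing_distance:.0f} ft in that time")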

The Case for Turning Over the Wheel

In the other collisions involving Google’s driverless cars, only humans were at fault. This time, the car was operating in autonomous mode, and the fault lies at least partly with the machine.

The impact of the collision may be much more severe than the damage the driverless car suffered to its front-left fender, front-left wheel and one of its sensors. The accident is a hit to the reputation of driverless vehicles.

The average consumer typically hears about driverless cars only when the vehicles have been involved in collisions, which hasn’t done much for the technology’s reputation, said Jack Nerad, executive market analyst for Kelley Blue Book.

“There is also the issue of control,” he told TechNewsWorld.

Many people believe their driving skills are better than others’ are, Nerad said. “Thus, many folks would prefer to do it themselves versus letting an unknown machine take over. We’ve all experienced computer crashes, but a computer crash in an autonomous car could lead to a genuine life-threatening crash.”

In a driverless near future, things might have played out differently. Had both the bus and the car been operated by machines, fault-finding would have been a lot easier, suggested Charles King, principal analyst for Pund-IT.

“It also brings up issues of liability and possible punishment fit for a TV sitcom,” King told TechNewsWorld. “Who made the arrest? Robocop on traffic detail? What would a jury of peers consist of? Laptops, smartphones and tablets? If the accused is Android based, would it be fair for the judge to be an iPad?”

More Than Human

The common feeling among drivers that their skills are superior to those of other drivers may play a large role in the public’s attitude toward driverless vehicles.

“Safer operation is a selling point, but what is not generally reported is that autonomously driven vehicles will still be involved in collisions,” Kelley’s Nerad said. “Assessing fault will be an issue on those occasions, and that is just one of many issues that stand in the path of autonomous vehicle adoption.”

Google’s journey deep into uncharted territory has yielded new insights into the art of driving and has helped engineers emulate some of the behaviors that make human drivers so efficient, at least when they aren’t clogging a city’s arteries behind a traffic incident.

An exploration of those behaviors led to the right-turn procedure that the driverless car attempted when it pulled out in front of that bus. However, it’s reasonable to expect that not every tactic tested will succeed without a stumble.
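
For illustration only, the logic behind that reentry decision might be sketched as a simple rule. Everything below, from the function name to the threshold, is a hypothetical simplification; Google has not published its actual code.

    # A hypothetical, highly simplified sketch of the reentry decision
    # described above. The names and the threshold are assumptions for
    # illustration; they do not come from Google.
    def should_reenter_lane(trailing_gap_seconds):
        """Reenter the lane only if the vehicle behind has time to yield."""
        MIN_YIELD_WINDOW = 3.0  # seconds, echoing the DMV's estimate
        return trailing_gap_seconds >= MIN_YIELD_WINDOW

    # The flawed assumption in this incident: a gap existed, so a rule
    # like this would say go, but the bus driver did not actually yield.
    print(should_reenter_lane(3.0))  # True, yet the bus kept coming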

“Those of us who drive in urban traffic every day know that expecting a transit bus to yield even when the law says it should is expecting too much,” said Nerad.

Further, “on top of the inane superiority complex in human drivers, there has been, historically, a fear of machines that ‘display or achieve human-like qualities,’” King pointed out.

“The HAL computer is a classic example, but there are hundreds of others,” he noted.

“A common attitude or conceit is that whatever the case, a person will always be a better, wiser and more empathetic decision maker,” King said. “That applies to complex moral issues but can also relate to physical actions — including driving.”

Quinten Plummer is a longtime technology reporter and an avid PC gamer who explored local news for a few years, covering law enforcement and government beats, before returning to writing about things run by ones and zeros and the people who make them. If it pushes pixels or improves lives, he wants to learn all he can about it.

1 Comment

  • The first rule of being on the road – according to my parents when I first learned to ride a bike – was: "Treat every other vehicle on the road like an idiot out to get you".

    I think self-driving cars need to understand that not every human driver is capable of choosing from multiple decisions when there is only a three-second gap in which to do what is correct.

    Basically: "What the hell? No way fella. Oh crap, you did. You moron." – There is your human three seconds of thought.
