
Who's driving now? Oops, we're still human – June 11, 2015 column

By MARSHA MERCER

I learned as a rookie newspaper reporter that there are no accidents in life – none involving cars, trucks or motorcycles, anyway.  

“Accident” implies an event that’s beyond our control. Because reporters cannot know what’s in a driver’s mind, we can’t say something is an accident.

Newspapers that shunned the A-word reported on automotive mishaps, incidents and collisions. Blow-dried TV announcers and traffic reporters shouting over helicopter noise favored the more vivid noun “crash.”

Copy editors weren’t the only sticklers for the right word to describe highway misfortune. Years ago, the National Highway Traffic Safety Administration tried to expunge the concept of traffic accidents.

Beginning in 1997, by eliminating the word from speeches, news releases and publications, the agency hoped to re-educate people that car wrecks weren’t acts of God but predictable and preventable events.

The agency also launched a “Crashes are not accidents” campaign. From time to time, other groups have taken up the cause, but people hang onto the idea of tragic highway accidents. Now, though, while we can’t stop the lethal combination of speed and distance, we at least buckle our seat belts.

Technology is making cars safer. Backup cameras help you park and avoid pedestrians. Laser- or radar-based adaptive cruise control keeps your car a safe distance from the one ahead, and lane-departure warnings alert you if you veer out of your lane.

But what happens when technology – and not a human – controls the car?

This summer, Google will test the prototype of its fully self-driving car on city streets in Mountain View, Calif. The cautious two-seater won’t exceed the grand speed of 25 mph. Other companies working on driverless cars include General Motors, Tesla, Mercedes-Benz and the Chinese Web company Baidu.   

States are eager to get on board. Michigan, Florida, Nevada and Washington, D.C., have passed laws allowing automakers to test driverless cars. Virginia Gov. Terry McAuliffe in March proclaimed the commonwealth “open for business” for the testing and deployment of autonomous vehicles.

In Northern Virginia, 70 miles of busy highways – parts of Interstates 66, 95 and 495 and U.S. Routes 29 and 50 – could be used to test self-driving cars within a year, the Richmond Times-Dispatch reported June 1.
  
The potential for safer highway travel has instant appeal. Autonomous vehicles don’t drive drunk or send texts or take selfies behind the wheel. But they do have to deal with humans in other cars.

Since Google began testing self-driving cars in 2009, the vehicles have logged nearly 2 million miles with a dozen minor accidents – but “Not once was the self-driving car the cause of the accident,” the company announced. Other motorists rear-ended or sideswiped the autonomous car, or the car was in manual mode and its human driver at fault. 

The cars don’t have to be perfect; they just have to beat humans at driving, developers say. But that seems a low bar. Humans are terrible drivers.

In 2010, there were 33,000 traffic deaths, nearly 4 million injuries and 24 million vehicles damaged in motor vehicle crashes in the United States, the National Highway Traffic Safety Administration reported last month. It still avoids the word accident.

Meanwhile, states are struggling with new rules of the road. California allows self-driving cars on public roads but requires a steering wheel, brake pedal and accelerator – and a trained safety driver who can take over in case there’s a problem.

That may lead to a false sense of security. Google in the fall of 2012 started letting its employees take home self-driving Lexus SUVs. The employees promised to be ever-alert and take the wheel as necessary. It didn’t work out.  

“We discovered something we really hadn’t seen coming but was obvious in retrospect,” Astro Teller, the improbably named director of Google X, the division that explores innovative products, said in a South by Southwest talk in March.

“Once people trust the system, they trust it. Our success was itself a failure,” he said. “The assumption that people can be a reliable backup for the system was a total fallacy.”

The only way a driverless car will work, Google decided, is to make it totally driverless. No steering wheel, no brake pedal, no accelerator. The car must drive itself from Point A to Point B at the push of a button.

“That has been a lot more work than we thought, but it’s the right thing to do,” Teller said.

Google’s decision to adjust the technology to fit the human – rather than expecting the human to adjust to the technology – is a good lesson for us all. Experiment, yes, and take the time for safety. Crashes are not accidents.

©2015 Marsha Mercer. All rights reserved.
