When Driver Error Becomes Programming Error

The law, technology and self-driving automobiles.
Image: Google robocar racetrack ride. (Credit: jurvetson via Flickr)

Joel Shurkin, Contributor

(Inside Science) -- Ninety percent of automobile accidents now involve human error. If scientists succeed in producing computer-driven cars, responsibility may shift to programming errors. In that case, who sues whom? Who is liable?

Scientists working on computer-driven cars -- so-called autonomous vehicles -- are dealing with issues Henry Ford never worried about, and the technology is ahead of the law.

Google has several fully autonomous cars roaming the streets of California.

In some ways it is already happening in steps. Most new cars have anti-lock braking systems, self-adjusting cruise control, traction control, and computers that can slam on the brakes when they detect an imminent collision. The driver just enjoys the ride.

Meanwhile, experimental autonomous cars get better.

Researchers at Stanford University in California reported at the meeting of the American Association for the Advancement of Science in San Jose, California, last week that they have produced a car fully capable of competing on a racetrack with a champion amateur driver. The car, a modified Audi TTS, has no driver but, using global positioning systems, computer-driven controls and programmed rules, is just as capable. But the car still "thinks" like a computer, not like a human driver, raising interesting issues, said Chris Gerdes, professor of mechanical engineering at Stanford.

The car, nicknamed Shelley, was first tested at the Bonneville Salt Flats in Utah, "a good place to test it since there is nothing to hit for miles," Gerdes said. Shelley got up to 140 mph.

It ran up Pikes Peak six times by itself, he said.

The most complicated test was at Thunderhill Race Track in Willows, California, where it has reached 120 mph, well within the speeds and times recorded by human drivers.

"It's good to look at race car drivers because they use the same physics and mathematics to use the friction between the tires and the road to control the cars to the limits of their capabilities," he said.

The Stanford team matched Shelley against David Vodden, CEO of the track and a champion amateur driver who knew the track intimately. Their lap times came within fractions of a second of each other.

Gerdes said that at one point Vodden "cheated" by driving outside the marked lines, taking something of a shortcut. Shelley wouldn't do that because it was programmed to obey all the rules, including staying within the lines.

That is an example of the looming problems, Gerdes said.

For instance, imagine you are driving down a street and an illegally parked car is blocking the route. Alongside the parked car are double lines, which under the law cannot be crossed. You can't get around that car without crossing those lines. Almost everyone would just do it, Gerdes said.

But an autonomous car programmed not to break the laws could not. It might just sit there.
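
A minimal sketch of that deadlock, using invented rule and maneuver names rather than any real vehicle's software: a planner that rejects every option violating a hard-coded rule is left with nothing to do but wait.

```python
# Hypothetical sketch of a strictly rule-bound planner; the rule and
# maneuver names are invented for illustration only.

RULES = {
    "no_double_line_crossing": lambda m: not m["crosses_double_lines"],
    "no_speeding": lambda m: m["speed_mph"] <= m["speed_limit_mph"],
}

def legal(maneuver):
    """A maneuver is allowed only if it violates none of the hard-coded rules."""
    return all(check(maneuver) for check in RULES.values())

# The blocked-street scenario: the only way past the parked car crosses
# the double lines, so the rule-bound planner never picks it.
candidates = [
    {"name": "pass_parked_car", "crosses_double_lines": True,
     "speed_mph": 20, "speed_limit_mph": 25},
    {"name": "wait_in_lane", "crosses_double_lines": False,
     "speed_mph": 0, "speed_limit_mph": 25},
]

chosen = next((m for m in candidates if legal(m)), None)
print(chosen["name"] if chosen else "no legal maneuver")
# -> wait_in_lane: the car obeys every rule and just sits there.
```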

The same is true of speed limits. Should you program your car to exceed the legal limits? And what counts as a reasonable speed, the standard police often use when monitoring traffic?

Is the desire for safety greater than our adherence to laws? And what about product liability?

Bryant Walker Smith, professor of law and engineering at the University of South Carolina in Columbia, said the burden of liability is likely to shift more toward manufacturers than consumers, and manufacturers will have to pass those costs on to consumers. If the cars are actually safer, however, the number of lawsuits is likely to diminish.

Parsing out liability -- who is responsible for what -- will be far more complicated with these cars, he said.

Drivers for companies such as FedEx and UPS can have accidents, but the costs of that liability haven't stopped them from delivering packages; they pass the costs on to consumers. He expects manufacturers of autonomous cars to learn to do the same, but a great deal of uncertainty remains.

Autonomous driving can work for trucks as well. Josh Switkes, of Peloton Technology in Mountain View, California, showed a way of "platooning" large vehicles: one truck gets behind another and hooks up to it electronically. Whatever the driver of the first truck does is duplicated exactly by the following vehicle; essentially, the first driver is driving both. When the first truck slows, the second truck slows at exactly the same time and rate.

Switkes said this improves fuel economy for both trucks and reduces the most common truck accidents.

The two trucks are linked virtually through the cloud. When the linkup is not possible, the driver of the second truck takes over.
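
Here is a minimal sketch of that follower logic, with invented message fields, timeout and stub classes rather than Peloton's actual system: the second truck replays the lead truck's throttle and brake commands, and hands control back to its driver when the link goes quiet.

```python
import queue

# Hypothetical sketch of the platooning follower; the message fields, the
# timeout and these stub pieces are invented for illustration only.

LINK_TIMEOUT_S = 0.5   # assumed: how long the follower waits for each command
MAX_IDLE_CYCLES = 3    # assumed: silent cycles allowed before the driver takes over

class Link:
    """Stand-in for the cloud/radio link carrying the lead truck's commands."""
    def __init__(self):
        self.messages = queue.Queue()
    def receive(self, timeout):
        try:
            return self.messages.get(timeout=timeout)
        except queue.Empty:
            return None

def follow(link, apply_controls, release_to_driver):
    """Mirror the lead truck's throttle/brake; hand back control if the link drops."""
    idle = 0
    while idle < MAX_IDLE_CYCLES:
        msg = link.receive(timeout=LINK_TIMEOUT_S)
        if msg is None:
            idle += 1                                   # nothing heard this cycle
            continue
        idle = 0
        apply_controls(msg["throttle"], msg["brake"])   # duplicate the lead driver's inputs
    release_to_driver()   # linkup not possible: the second truck's driver takes over

# Tiny demonstration: a few canned commands from the lead truck, then silence.
link = Link()
for throttle, brake in [(0.3, 0.0), (0.1, 0.0), (0.0, 0.6)]:
    link.messages.put({"throttle": throttle, "brake": brake})

follow(link,
       apply_controls=lambda t, b: print(f"follower: throttle={t} brake={b}"),
       release_to_driver=lambda: print("follower: link lost, driver takes over"))
```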

People may expect too much from autonomous vehicles, Smith said. "People want cars that can go everywhere, never crash and let me sleep in the back."

What happens if the autonomous vehicle doesn't meet those expectations? Is it bad design? Has the consumer been misled? Have manufacturers promised more than they can deliver?

Despite the uncertainty, Smith expects that the legal problems will be worked out. It may slow progress toward autonomous cars, he said, but it won't stop it.

 


Joel Shurkin is a freelance writer in Baltimore who has also taught journalism and science writing.