In the last three years there has been a lot of chest thumping about how autonomous vehicles would infiltrate the market, and how fast this technological change would happen. I have written about the autonomous vehicle that drove across the United States. The vehicle achieved that only on the highway, and had to hand control back to a human driver in cities.
While the technology is being developed for the trucking industry as an advanced driver assistance system (ADAS), it is telling that it can only be used on highways. The reason is that this technology, called “Copilot,” cannot make sense of narrow streets, oncoming traffic, pedestrians and cyclists: all the components at play in a city setting. Despite the claims of autonomous vehicle boosters that the technology is close to being adopted in cities, the sophistication needed to recognize and respond to the multitude of discrete movements in a city has still not been developed.
Some of the speakers at the excellent AARP Transportation conference held last week were even more blunt. They posit that Level Five, completely autonomous technology is being developed by software engineers who live in a certain part of California, are used to certain populations of people, and have designed software based upon their own experience of open space and streets.
There have been suggestions that the current technology does not recognize human shadows, and has difficulty recognizing the human form in darker clothing or unfamiliar shapes.
Last year I wrote about the tragic death of a woman walking a bicycle across the road in Tempe, Arizona. There is an unfortunate video that shows not only the moment Ms. Herzberg was struck, but also the face of the Uber driver, who appears to be looking at a cell phone instead of the road.
Ms. Herzberg was struck and killed by an autonomous vehicle moving at 40 miles an hour (64 km/h) at 10:00 p.m. on a clear and dry evening. But the autonomous technology is only a decade old, and is “now starting to experience the unpredictable situations that drivers can face.”
As reported by Kate Conger in The New York Times, the National Transportation Safety Board investigation has now “attributed the crash mostly to human error, but also faulted an ‘inadequate safety culture’ at Uber.” The 46-year-old driver was apparently watching a television show on her device at the time of the crash and has been charged with negligent homicide. She has entered a not guilty plea.
There are not yet state or federal laws that regulate liability for crashes involving autonomous vehicles. In fact, states like Arizona have treated the autonomous vehicle itself as not responsible, concluding that Uber would not be liable for the crash.
I previously met the lawyers working on autonomous vehicle legislation for the European Union (EU) in Frankfurt. They suggested it would take several years of work to ensure adequate legislation and insurance coverage in Europe for Level Five autonomous vehicle operation.
As the technology develops, one challenge with autonomous vehicles is establishing whether liability rests with the vehicle’s operating system, the service using or insuring it, or the driver of the vehicle. That will be a moving target as precedents develop and technology moves forward.
Today, just as in the Alberta example of Tesla occupants sleeping while driving, autonomous vehicles still require a driver who keeps their hands on the steering wheel and pays attention.
As the county attorney in the Tempe, Arizona fatality observed:
“Distracted driving is an issue of great importance in our community. When a driver gets behind the wheel of a car, they have a responsibility to control and operate that vehicle safely and in a law-abiding manner.”