Finally, an article that deals with a critical question regarding self-driving cars: will they require isolation in the urban environment and have to be given their own restricted rights-of-way? Or will other users, especially pedestrians and cyclists, have to be trained and regulated to give autonomous vehicles priority?

This article deals only with pedestrians and the problem of jay-walking. But what if the presence of cyclists sharing space with vehicles proves too problematic for automated vehicles, especially when cyclists are as much an annoyance as potential fatalities? Will the progress made toward complete streets and shared spaces be sacrificed in order to facilitate another utopian vision, another variation on Motordom?


Whether self-driving cars can correctly identify and avoid pedestrians crossing streets has become a burning issue since March after an Uber self-driving car killed a woman in Arizona who was walking a bicycle across the street at night outside a designated crosswalk. … Meanwhile, other initiatives are losing steam. Elon Musk has shelved plans for an autonomous Tesla to drive across the U.S. Uber has axed a self-driving truck program to focus on autonomous cars. Daimler Trucks, part of Daimler AG, now says commercial driverless trucks will take at least five years. Others, including Musk, had previously predicted such vehicles would be road-ready by 2020.

With these timelines slipping, driverless proponents like Andrew Ng, a machine-learning researcher, say there’s one surefire shortcut to getting self-driving cars on the streets sooner: persuade pedestrians to behave less erratically. If they use crosswalks, where there are contextual clues—pavement markings and stop lights—the software is more likely to identify them. …

Rodney Brooks, a well-known robotics researcher and an emeritus professor at the Massachusetts Institute of Technology, wrote in a blog post critical of Ng’s sentiments that “the great promise of self-driving cars has been that they will eliminate traffic deaths. Now [Ng] is saying that they will eliminate traffic deaths as long as all humans are trained to change their behavior? What just happened?” …


The industry is understandably keen not to be seen offloading the burden onto pedestrians. Uber and Waymo both said in emailed statements that their goal is to develop self-driving cars that can handle the world as it is, without being dependent on changing human behavior.

One challenge for these and other companies is that driverless cars are such a novelty right now that pedestrians don’t always act the way they do around regular vehicles. Some people just can’t suppress the urge to test the technology’s artificial reflexes. Waymo, which is owned by Alphabet Inc., routinely encounters pedestrians who deliberately try to “prank” its cars, continually stepping in front of them, moving away and then stepping back in front of them, to impede their progress.

The assumption seems to be that driverless cars are designed to be extra cautious so the practical joke is worth the risk. “Although our systems do have super-human perception, sometimes people seem to think Newton’s laws no longer apply,” says Paul Newman, the co-founder of Oxbotica, a U.K. startup making autonomous driving software, who recalls the time a pedestrian ran up behind a self-driving car and jumped suddenly in front of it.

Over time driverless cars will become less fascinating, and people will presumably be less likely to prank them. In the meantime, the industry is debating what steps companies should take to make humans aware of the cars and their intentions. …

The problem with most of the computer vision systems that self-driving cars use … is they simply put a boundary box around an object and apply a label—parked car, bicycle, person—without the ability to analyze anything happening inside that box.
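The limitation described above can be made concrete with a small sketch. This is a hypothetical illustration (the class name, fields, and values are assumptions, not any vendor's actual API): a typical detector emits only a label, a score, and a box, so downstream planning has nothing to reason about beyond those boxes — no gaze, no pose, no hint of intent.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One output of a typical object detector: a label and a box, nothing more."""
    label: str        # e.g. "person", "bicycle", "parked car"
    confidence: float # classifier score in [0, 1]
    box: tuple        # (x_min, y_min, x_max, y_max) in pixels

# Hypothetical detections for a single camera frame. Note what is *absent*:
# nothing here says whether the person is about to step off the curb.
frame = [
    Detection("person", 0.94, (310, 120, 360, 260)),
    Detection("bicycle", 0.88, (300, 180, 390, 270)),
]

# The planner can only filter and track boxes, not analyze what happens inside them.
pedestrians = [d for d in frame if d.label == "person" and d.confidence > 0.5]
print(len(pedestrians))  # 1
```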

Eventually, better computer vision systems and better AI may solve this problem. Over time, cities will probably remake themselves for an autonomous age with “geofencing”—a fancy term for creating separate zones and designated pickup spots for self-driving cars and taxis. In the meantime, your parents’ advice probably still applies: Don’t jaywalk and look both ways before crossing the street.
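At its core, the geofencing idea is just a containment test: is this vehicle (or pickup request) inside a designated zone? A minimal sketch of that test, assuming zones are stored as simple polygons of map coordinates (the zone and coordinates here are invented for illustration), is the classic ray-casting point-in-polygon check:

```python
def point_in_zone(x, y, polygon):
    """Ray-casting test: cast a horizontal ray from (x, y) and count how many
    polygon edges it crosses; an odd count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            # x-coordinate where this edge crosses the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical designated pickup zone as a polygon of map coordinates.
pickup_zone = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(point_in_zone(5, 5, pickup_zone))   # True  -- inside the zone
print(point_in_zone(15, 5, pickup_zone))  # False -- outside the zone
```

Real deployments would use geodetic coordinates and a spatial index rather than a bare loop, but the containment test is the same idea.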

Comments

  1. AI just isn’t what people seem to think it is (it’s not intelligent in any human sense of the word). I think cars and trucks on freeways are feasible, but mixed traffic in urban environments? Forget about it. “AI is safer because we removed all the people it could hit” was always the end-game. Though I think some tech Utopians did drink their own Kool-Aid.

    I expressed my skepticism about driving in mixed traffic to a friend who is an AI researcher. “Oh, there’s one driving the streets of Mumbai right now,” he replied. Whelp, I thought, I wonder what I got wrong? Then the other shoe dropped: “It goes about 3 kilometers an hour.”

    The other problem, in my opinion, is that *if* there was a way to make the cars safe, pedestrians would immediately game the system. If people know how the car behaves and that it’s not going to kill us, they *will* get in the way, and traffic will crawl. The only thing that really keeps streets “safe” for cars is fear of death. (Which just goes to show how messed up motordom is. It brings to my mind that scene from the beginning of A Tale of Two Cities where the ancien régime noble runs down a peasant with his carriage.) The consequence for self-driving cars is stark: either we have safe cars that struggle to get anywhere, or we have deadly ones that further marginalize everyone not driving a polluting hunk of plastic and metal costing tens of thousands of dollars.

    I would put self-driving cars in cities in the same category as electronic voting. It is an inherently bad idea.

  2. Geof wrote: “The other problem, in my opinion, is that *if* there was a way to make the cars safe, pedestrians would immediately game the system. If people know how the car behaves and that it’s not going to kill us, they *will* get in the way, and traffic will crawl. ”

    Very interesting observation!

    But it seems to me that autonomous vehicles are being held to a much higher standard than human drivers are. If a pedestrian walks out from between parked cars looking at his cell phone and is hit by a human driver, then the pedestrian is generally held to be at fault – whereas if it’s an autonomous vehicle then the vehicle seems to get blamed.

    I for one think that we’d be ahead if autonomous vehicles merely reduce the carnage on the road, even if they don’t eliminate it.
