It is not a good thing to smash your Tesla into the back of a parked fire truck on the San Francisco Bay Bridge. A Tesla driver just did, and was reported to have a blood-alcohol level twice the legal limit. His defense? The car was driving. As reported in the Vancouver Sun, “according to the California Highway Patrol, the driver explained that his Tesla electric vehicle ‘had been set on autopilot,’ obviating the need for him to be in control of the vehicle or, well, sober.”
The California Highway Patrol nixed that idea, and the driver was sent to jail. But it also raises an interesting point: is the car the designated driver if someone is intoxicated? Wasn’t the whole idea of autopilot to give the driver the luxury of paying attention to things other than driving? Even though cars are like dens on wheels loaded with fun gadgetry, Tesla still states that its autopilot system is “not fully autonomous.” You can drive fast and have a good time, but “the company instructs drivers to be alert because they are ultimately responsible for their vehicle and whatever it smacks into.”
In 2016 there was the deadly crash in which a Tesla Model S and its driver failed to see a tractor-trailer turning onto a divided highway. The driver had relied on the autopilot, and in the “last 37 minutes of his drive, he had his hands on the wheel for just 25 seconds. He also ignored seven dashboard warnings and six audible warnings.” Some reports have said that he was watching a movie at the time of the crash.
While Elon Musk has been saying that there will be no need for an instrument panel in future vehicles, Tesla crashes have been attracting interest from investigators. Autopilots were supposed to protect occupants and “even pedestrians” from crashes.
While it has not been determined whether the autopilot was on when the Tesla plowed into the fire truck at over 100 kilometers per hour, the firefighters did note that the vehicle was unable to engage the autopilot afterward. The car was towed, they said. “No, it didn’t drive itself to the tow yard.”

Comments

    1. For years now, people have been treating the effects their cars have on others as things the car did, not things they did. How would this be any different?

      1. I think people have been believed for years. Phrases like “it was just an accident” and a systemic lack of accountability when vulnerable road users are killed by drivers are proof of it.

  1. Unfortunately, these events are symptoms of innovation; nothing is designed perfectly from the get-go. So you will continue to hear about the imperfections of autonomous development, just as our computers still exhibit blue screens to this day. But it’s the net reduction of these events we want to monitor, and if machines are better at the dull but complex task of driving than humans, then they will take over.
    Human distraction is the root of most accidents. You can consider this particular event a “distracted CPU,” in which either a sensor malfunctioned (akin to sun glare in the driver’s eye) or the algorithm did not perform well on this particular situation’s fusion of computer-vision, radar, laser, and GPS data. So you do the forensics and go back to the drawing board. As long as the number of collisions, limbs, and lives lost is reduced relative to the human-driven status quo, autonomous tech will continue on. We have so many things automated these days that it is inevitable.
