"it was the other driver's fault [Uber] for trying to beat the light and hitting the gas so hard." @Uber
The self-driving Uber involved in an accident in Arizona last Friday was driving through a yellow light when it was hit by a Honda CR-V, according to a police report Business Insider obtained through a Freedom of Information Act request.
The self-driving Uber was not at fault when the accident occurred, according to a spokesperson for the Tempe, Arizona police department. But the incident shows how humans may still be better equipped to handle complex driving scenarios where strictly following the rules of the road isn't necessarily the safest choice.
Alexandra Cole, the driver of the Honda CR-V, was attempting to make a left turn across three lanes of traffic when the accident occurred. Cole managed to drive across the first two lanes, which were backed up with cars, and thought she was clear to cross the third.
"As far as I could tell, the third lane had no one coming in it so I was clear to make my turn," Cole said in the police report. "Right as I got to the middle lane about to cross the third I saw a car flying through the intersection, but couldn't brake fast enough to completely avoid collision."
Uber's self-driving Volvo was driving through a yellow light at 38 mph, just below the speed limit, when it was hit by the Honda. The Uber then drove into a traffic pole and flipped onto its side as a result of the collision. No one was seriously injured in the accident.
A diagram of the accident. Vehicle 1 refers to the Honda CR-V while Vehicle 2 refers to the self-driving Uber. Tempe Police
Patrick Murphy, the Uber employee behind the wheel of the self-driving car, wrote in the report that he saw the Honda turning left, but that there was "no time to react" as traffic in the first two lanes had created a blind spot.
In terms of programming, the self-driving Uber did everything by the book. It had the right of way approaching a yellow light, and therefore zipped right through.
But one has to wonder whether a human driver, approaching the busy intersection while the light was turning yellow, may have slowed down.
"We saw the [Honda] car, it was coming fine on her end, but the other person just wanted to beat the light and kept going," a witness wrote in the police report. "All I want to say is it was good on the end of the [Honda] driving toward us, it was the other driver's fault [Uber] for trying to beat the light and hitting the gas so hard."
An Uber spokesperson told Business Insider the self-driving car did not accelerate while approaching the yellow light, but maintained its speed of 38 mph. Uber cars are programmed to always pass through a yellow light at their current speed if there is enough time to make it through the intersection.
Uber vehicle operators are also trained to take over at yellow lights if they don't feel comfortable proceeding through the intersection, the Uber spokesperson said.
Still, the witness' account of the accident raises an interesting point: would a human driver, seeing someone struggling to make a left turn through bumper-to-bumper traffic, have approached that yellow light differently?
The scene of the accident. Fresco News/Mark Beach
It's a similar question to the one posed during the National Highway Traffic Safety Administration's investigation into the fatal Tesla Autopilot accident. (Uber has compared its cars' self-driving capabilities to Tesla Autopilot.)
Joshua Brown was killed in May 2016 when his Tesla Model S collided with a truck while Autopilot was activated. NHTSA closed the investigation and determined Autopilot was not at fault because Brown had at least 7 seconds to hit the brakes before the car collided with the truck.
The accident raised concerns that people were beginning to over-rely on Level 2 autonomous systems, believing them to be more capable than they actually are.
Following the accident, Consumer Reports called on Tesla to rename Autopilot and disable its hands-free operation to make it clear the system wasn't fully self-driving. A warning will now sound if a Tesla driver takes his or her hands off the wheel while Autopilot is activated.
Both Ford and Waymo, Alphabet's self-driving-car company, have said they are developing fully self-driving cars because they fear people will become too complacent with Level 2 systems.
Although Level 2 driving systems come with their own risks, they can also make driving safer: crash rates for Tesla vehicles have fallen 40% since Autopilot was first installed in 2015.
But the Uber accident further shows how human drivers can still be better equipped to handle some complex driving situations.
We will never know whether the Arizona accident would have been avoided had a person been controlling the vehicle the entire time, but it highlights how far we still are from fully capable self-driving vehicles.