Perhaps you missed it amid the muck and mire coming out of Washington and Hollywood these days, but a headline that caught my eye this week was that a self-driving shuttle in Las Vegas was involved in a crash in its first few hours of operation.
It was a minor collision — in fact, “collision” or “crash” seems hyperbolic; what actually happened was the shuttle bus got a dent in its fender — but as the story went viral on social media, it was enough for skeptics of automation to say, “See? Told ya!”
But this “crash” was actually caused by human error — on the part of the delivery truck driver who backed into the shuttle — just like 90 percent of all crashes on roadways. (Police issued the delivery driver a ticket, by the way.)
City officials in Las Vegas say the shuttle operated exactly as it should have in the situation — its sensors recognized the truck and stopped — and the shuttle was back in operation the next day. But a reporter who was riding on the public shuttle covering its first day on the job had an interesting take: The autonomous vehicle didn’t necessarily do everything a human driver might have.
“We had about 20 feet of empty street behind us (I looked), and most human drivers would have thrown the car into reverse and used some of that space to get away from the truck,” Jeff Zurschmeide wrote on digitaltrends.com. “Or at least leaned on the horn and made our presence harder to miss.”
Of course, this is all correctable stuff. Such is the case with any new technology: There will be glitches. This is why my iPhone is two generations behind (well, that and the exorbitant cost of a new smartphone these days) and I’m always the last one to update my operating system or my apps.
I’m not opposed to change, per se; I just like to make sure stuff actually works the way it’s supposed to before I incorporate it into my daily life. I’m funny like that.
Fortunately, there are plenty of early adopters who are willing to pay top dollar to be the first ones to own and experience new technology and work out the kinks for the rest of us.
Wednesday’s fender bender of the Vegas shuttle wasn’t the first self-driving car crash, but it was the first involving one operating in public service, a representative for the National Transportation Safety Board told the news service Reuters.
The NTSB is investigating the crash with hopes of learning more about “how self-driving vehicles interact with their environment and the other human-driven vehicles around them,” the representative said. The agency has already investigated numerous other crashes involving self-driving cars, all of which were found to be the result of human error in at least some capacity.
The most controversial was a May 2016 fatal crash involving a Tesla. In September, the NTSB unanimously concluded the crash resulted from a combination of factors: the driver of the truck the Tesla collided with failed to yield, and the Tesla driver was inattentive because of his over-reliance on the Autopilot feature. The board subsequently recommended that automakers work to ensure semiautonomous systems aren’t misused.
So far, though, all the data seem to indicate that if the only cars on the road were self-driving ones, none of these crashes would have occurred.