A collision between a self-driving shuttle and a human-driven truck in Las Vegas shows that accidents are inevitable as we share the road with robots, and suggests who is more likely to be at fault.
Once again, a driverless vehicle and a human-driven vehicle collided, and once again, the human was at fault.
On Wednesday, a driverless electric shuttle bus made its debut in downtown Las Vegas. Just a few hours later, it was involved in an accident with a delivery truck. The damage was minor, and no one was hurt, according to police.
Police determined that the shuttle came to a stop when it sensed that the truck was trying to back up. But the truck continued to back up, until its tires touched the front of the shuttle.
The truck's driver was ultimately cited for illegal backing.
The shuttle, manufactured by French company NAVYA and sponsored by AAA and Keolis, has the ability to read traffic signals and stop for pedestrians, according to the City of Las Vegas. It underwent a 10-day test in January, with traffic lanes blocked along a designated route in the city. On Wednesday, it had begun offering free rides for up to 11 passengers along a half-mile loop in the Fremont East neighborhood, which was recently designated as part of the city's "innovation district."
The free rides were set to be offered for the next year. The shuttle's top speed is 25 mph, and a human attendant is expected to be on board at all times to ensure safety.
AAA media relations specialist Mike Blasky took to Twitter after the accident: "Truck making delivery backed into shuttle, which was stopped. Human error causes most traffic collisions and this was no different. Driver of truck was cited. No one hurt except a bruised bumper!"
Testing of the shuttle was shut down for the day, but will continue during the 12-month pilot, according to a statement from the City of Las Vegas.
"The shuttle did what it was supposed to do, in that it's sensors registered the truck and the shuttle stopped to avoid the accident," the city's statement said. "Unfortunately the delivery truck did not stop and grazed the front fender of the shuttle. Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided."
This kind of accident will become more and more common, Michael Ramsey, an autonomous vehicle analyst at Gartner, told TechRepublic after an accident between a self-driving Uber and a human driver in Tempe, AZ back in March. "Robots don't drive like humans," Ramsey said. "That's a good thing. Humans are terrible at driving, but other humans know this and adjust our driving to account for what regular people would do on the road. There are many unwritten rules of driving that humans can quickly adjust to that robots will not, and this will lead to accidents."
SEE: IT leader's guide to the future of autonomous vehicles (Tech Pro Research)
If all cars on the road were autonomous, accidents would decline, Ramsey told TechRepublic after the Uber accident. "While they are mixed together, the inflexibility of computers may lead to accidents that wouldn't have happened before even as some other accidents are prevented," he said.
In May 2016, a Tesla driver was killed in an accident while the car was operating in its semi-autonomous Autopilot mode. A US Department of Transportation investigation did not identify any defects in design or performance of the Autopilot system. According to data released by Tesla during the investigation, Autopilot has lowered the number of crashes among its drivers by 40%.
It remains to be seen if these accidents will hinder self-driving efforts moving forward. However, it seems unlikely: The UK government is investing £8.1 million in semi-autonomous trucking trials across its roadways in 2018, and Tesla has teased its autonomous trucks as well.
The 3 big takeaways for TechRepublic readers
1. On Wednesday, a driverless shuttle bus made its debut on the streets of Las Vegas, and got into a minor accident when a delivery truck manned by a human driver backed into it.
2. The delivery truck's driver was cited for illegal backing.
3. Accidents like this will likely continue as long as humans and self-driving cars share the road, experts say.
- Why laws regulating autonomous vehicles are needed now (TechRepublic)
- Self-driving cars vs hackers: Can these eight rules stop security breaches? (ZDNet)
- Elon Musk and the cult of Tesla: How a tech startup rattled the auto industry to its core (TechRepublic)
- US House approves bill to advance autonomous car testing (ZDNet)
- Our autonomous future: How driverless cars will be the first robots we learn to trust (PDF download) (TechRepublic)