"A tragedy tends to focus discussion in a way that a broader speculation cannot," Bryant Walker Smith said. The tragedy he was referring to was the recent fatality that occurred in a Tesla Model S that was operating in Autopilot mode. When neither the driver, Joshua Brown, nor Autopilot—an autonomous driving feature—failed to brake, the car crashed into a tractor trailer, killing the driver.
Smith, a professor at the University of South Carolina, is one of the leading experts on the legal aspects of self-driving vehicles. He was speaking at an event hosted on Wednesday by the Association for Unmanned Vehicle Systems International (AUVSI).
The accident drew attention to the fact that self-driving technology, if only in small part, has already become woven into our highway ecosystem. And many wonder whether automakers like Tesla have gone too far in releasing new self-driving technology—though we argue that a fuller context is necessary before placing blame for that accident.
But how close are we, really, to fully self-driving cars?
"The technology is not ready; it's not demonstrated," Smith said. The webinar, which focused on the barriers toward widespread integration of automated vehicles, was a preview of the discussions that will take place at the upcoming Automated Vehicles Symposium, held July 19-21 in San Francisco.
But, Smith pointed out, the technology is only one component of the larger picture. An even larger factor is public acceptance.
"Law evolves with society," Smith said. "Details matter; but so does the broader social context." The biggest challenge, he said, is managing expectations.
What happens after a crash or injury, Smith asked, has the potential to strongly shape public opinion. "If the public interprets these technologies as scary and dangerous, the law will be interpreted that way," he said.
SEE: Why the US government should take Tesla up on its offer to share Autopilot data (TechRepublic)
Smith said that when technology is new, "there's a need to establish a credible case for the technology" as a means of marketing it, preparing the public for crashes, and responding appropriately to accidents.
How to do this? Smith urged developers to share their own safety philosophy.
So, when will the technologies reach a "demonstrated level of acceptable risk?" It's a tough question. "How safe is safe enough? How is safety demonstrated? How confident is confident enough?" Smith asked. "And who decides?"
Josh Switkes, CEO of Peloton Technology, Inc., also spoke about public acceptance, from the self-driving truck perspective. Trucks, he said, are very dangerous today: because they are so heavy, an accident carries a much greater likelihood of injuries.
Switkes spoke about when the public will be ready to accept driverless trucks.
"Today, people would say they don't want an 80,000 pound death machine powered by a laptop," said Switkes. "But once they see that fully-autonomous vehicles can be much safer than humans, there will be a sudden change—they'll say that they want an autonomous system."
Beyond thinking of autonomous vehicles in terms of "cars" and "trucks," Smith believes that the category of fully self-driving vehicles "will be fundamentally different."
SEE: Autonomous driving levels 0 to 5: Understanding the differences (TechRepublic)
The distinctions, he said, aren't as relevant, and the applications will be different. Sticking to "old categories of the past," Smith said, "won't be useful for predicting all the technologies that will be implemented."
The US Department of Transportation will be looking at regulations for autonomous vehicles this summer. Smith predicts that its guidelines will be very broad, "with many of the details to be filled in later."
The best way for the public, and regulators, to consider the challenges of autonomous driving would be to expect more safety from all vehicles, Smith said.
He urged the public to "reject the status quo," citing the 30,000 direct deaths caused by automotive accidents each year.
The public, Smith said, must "step back and say, what do we really want? What role does automated technology bring to our future?"
"We should be terrified," he said, "of conventional driving."
- How driverless cars will transform auto insurance and shift burden onto AI and software (TechRepublic)
- 'Socially-cooperative' cars are part of the future of driverless vehicles, says CMU professor (TechRepublic)
- Tesla's Elon Musk hints Model 3 will be fully autonomous (ZDNet)
- CES 2016: Carmakers kick off the year with big moves in autonomous vehicles (TechRepublic)
- Tech giants vs. automotive titans: The battle for your car's data (TechRepublic)
Hope Reese has nothing to disclose. She doesn't hold investments in the technology companies she covers.
Hope Reese is a Staff Writer for TechRepublic. She covers the intersection of technology and society, examining the people and ideas that transform how we live today.